Enum Class ToxicityAnalyzer

java.lang.Object
java.lang.Enum<ToxicityAnalyzer>
ch.privately.owas.nlp.ToxicityAnalyzer
All Implemented Interfaces:
Serializable, Comparable<ToxicityAnalyzer>, Constable

public enum ToxicityAnalyzer extends Enum<ToxicityAnalyzer>
 Analyzes text for toxic content.

 Test that the model is loaded before calling analyzeText(); calling it before the analyzer is initialised throws an IllegalStateException.
 A simple use case to get the toxicity of a text:


     public float analyzeForToxicity(String text) {
         if (ToxicityAnalyzer.INSTANCE.isLoaded()) {
             return ToxicityAnalyzer.INSTANCE.analyzeText(text);
         }
         return 0.0f;
     }
 
  • Enum Constant Details

  • Method Details

    • values

      public static ToxicityAnalyzer[] values()
      Returns an array containing the constants of this enum class, in the order they are declared.
      Returns:
      an array containing the constants of this enum class, in the order they are declared
    • valueOf

      public static ToxicityAnalyzer valueOf(String name)
      Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
      Parameters:
      name - the name of the enum constant to be returned.
      Returns:
      the enum constant with the specified name
      Throws:
      IllegalArgumentException - if this enum class has no constant with the specified name
      NullPointerException - if the argument is null
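values() and valueOf(String) follow the standard Java enum contract. A minimal, self-contained sketch using a hypothetical stand-in enum (the usage example above suggests ToxicityAnalyzer exposes a single INSTANCE constant, but the stand-in below is purely illustrative):

```java
public class EnumContractDemo {
    // Hypothetical single-constant enum, mirroring the singleton pattern
    // implied by ToxicityAnalyzer.INSTANCE in the usage example above.
    enum Analyzer { INSTANCE }

    public static void main(String[] args) {
        // values() returns the constants in declaration order.
        Analyzer[] all = Analyzer.values();
        System.out.println(all.length); // 1

        // valueOf() requires an exact, case-sensitive name match.
        System.out.println(Analyzer.valueOf("INSTANCE")); // INSTANCE

        try {
            Analyzer.valueOf("instance"); // wrong case: not an exact match
        } catch (IllegalArgumentException e) {
            System.out.println("no such constant");
        }
    }
}
```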
    • init

      public void init(android.content.Context context)
      Initialise the toxicity analyzer by loading the model.
      Parameters:
      context - The context used to load the model.
    • isLoaded

      public boolean isLoaded()
      Returns:
true if the model is loaded, false otherwise.
    • analyzeText

      public float analyzeText(String text)
      Analyze a text for toxicity.
      Parameters:
      text - The text to analyze.
      Returns:
The probability that the input text is toxic, between 0 and 1 (inclusive); higher values indicate more toxic content.
      Throws:
      IllegalStateException - if the method is called before the class is initialised.
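Putting init(), isLoaded() and analyzeText() together: a sketch assuming an Android Context is available at setup time. The class name, method names other than the ToxicityAnalyzer API, and the 0.8 threshold are illustrative assumptions, not part of the library.

```java
import android.content.Context;

import ch.privately.owas.nlp.ToxicityAnalyzer;

public class ToxicityChecker {
    // Hypothetical threshold; tune for your application.
    private static final float TOXIC_THRESHOLD = 0.8f;

    public void setUp(Context context) {
        // Load the model once, e.g. from Application.onCreate().
        ToxicityAnalyzer.INSTANCE.init(context);
    }

    public boolean isToxic(String text) {
        if (!ToxicityAnalyzer.INSTANCE.isLoaded()) {
            // analyzeText() would throw IllegalStateException here,
            // so treat not-yet-loaded as "not toxic" (or queue the text).
            return false;
        }
        return ToxicityAnalyzer.INSTANCE.analyzeText(text) >= TOXIC_THRESHOLD;
    }
}
```

Guarding with isLoaded() rather than catching IllegalStateException keeps the happy path exception-free, matching the pattern shown in the class-level example.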