Enum Class HateSpeechAnalyzer

java.lang.Object
java.lang.Enum<HateSpeechAnalyzer>
ch.privately.owas.nlp.HateSpeechAnalyzer
All Implemented Interfaces:
Serializable, Comparable<HateSpeechAnalyzer>, Constable

public enum HateSpeechAnalyzer extends Enum<HateSpeechAnalyzer>
 Singleton enum used to analyze text for hate speech content.

 Call init(android.content.Context) once, and check that the model is loaded with isLoaded(), before calling analyzeText(String).
 A simple use case, testing a text for hate speech, would be:


     public float analyzeForHateSpeech(String text) {
         // init(Context) must have been called beforehand; otherwise the
         // model is not loaded and analyzeText() would throw.
         if (HateSpeechAnalyzer.INSTANCE.isLoaded()) {
             return HateSpeechAnalyzer.INSTANCE.analyzeText(text);
         }
         // Model not ready yet: report a neutral score.
         return 0.0f;
     }
 
  • Enum Constant Details

    • INSTANCE

      public static final HateSpeechAnalyzer INSTANCE
      The singleton instance used to access the analyzer.

  • Method Details

    • values

      public static HateSpeechAnalyzer[] values()
      Returns an array containing the constants of this enum class, in the order they are declared.
      Returns:
      an array containing the constants of this enum class, in the order they are declared
    • valueOf

      public static HateSpeechAnalyzer valueOf(String name)
      Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
      Parameters:
      name - the name of the enum constant to be returned.
      Returns:
      the enum constant with the specified name
      Throws:
      IllegalArgumentException - if this enum class has no constant with the specified name
      NullPointerException - if the argument is null
    • init

      public void init(android.content.Context context)
      Initialises the hate speech analyzer by loading the model. Must be called before containsHateSpeech(String) or analyzeText(String).
      Parameters:
      context - the context used to load the model.
    • isLoaded

      public boolean isLoaded()
      Checks whether the model has been loaded.
      Returns:
      true if the model is loaded, false otherwise.
    • containsHateSpeech

      public boolean containsHateSpeech(String text)
      Analyzes a text for hate speech.
      Parameters:
      text - the text to analyze.
      Returns:
      true if the text contains hate speech, false otherwise.
      Throws:
      IllegalStateException - if the method is called before the analyzer is initialised via init(Context).
    • analyzeText

      public float analyzeText(String text)
      Analyzes a text for hate speech.
      Parameters:
      text - the text to analyze.
      Returns:
      the probability that the input text is hateful, between 0 and 1 inclusive; higher values indicate more hateful content.
      Throws:
      IllegalStateException - if the method is called before the analyzer is initialised via init(Context).
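
      The probability returned by analyzeText(String) can be mapped to a boolean verdict in the spirit of containsHateSpeech(String). A minimal sketch of such a mapping, assuming a hypothetical cutoff of 0.5 (the library does not document the threshold that containsHateSpeech actually uses):

      ```java
      public class HateSpeechThresholdSketch {
          // Hypothetical cutoff; the actual threshold used by
          // containsHateSpeech(String) is not documented.
          static final float CUTOFF = 0.5f;

          /** Maps an analyzeText-style probability in [0, 1] to a boolean verdict. */
          static boolean isHateful(float probability) {
              if (probability < 0.0f || probability > 1.0f) {
                  throw new IllegalArgumentException("probability must be in [0, 1]");
              }
              return probability >= CUTOFF;
          }

          public static void main(String[] args) {
              System.out.println(isHateful(0.92f)); // high score: flagged
              System.out.println(isHateful(0.08f)); // low score: not flagged
          }
      }
      ```

      Keeping the cutoff in one place makes it easy to tune the trade-off between false positives and false negatives for a given application.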