HateSpeechAnalyzer

public class HateSpeechAnalyzer

A class that analyzes text for hate speech content.

The analyzer must be initialised with initManager() before use, and it is a good idea to check that the model is loaded before calling analyze(text:). A simple use case that scores a text for hate speech:

func classifyText(text: String) -> Double? {
    // Only analyze once the model has finished loading.
    guard HateSpeechAnalyzer.sharedInstance().isLoaded() else {
        return nil
    }
    return HateSpeechAnalyzer.sharedInstance().analyze(text: text)
}
  • Class function to get the single instance of this class.

    Declaration

    Swift

    public class func sharedInstance() -> HateSpeechAnalyzer

    Return Value

    The singleton instance of the class.

  • Initialise the hate speech analyzer. Throws if initialisation fails.

    Declaration

    Swift

    public func initManager() throws
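
    Since initManager() is declared throws, initialisation failures must be handled. A minimal sketch, assuming the analyzer is initialised once before any analysis (the error handling shown is illustrative):

    do {
        try HateSpeechAnalyzer.sharedInstance().initManager()
    } catch {
        // The model could not be loaded; analyze(text:) should not be called.
        print("HateSpeechAnalyzer initialisation failed: \(error)")
    }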
  • Check whether the model has been initialised.

    Declaration

    Swift

    public func isLoaded() -> Bool

    Return Value

    true if the model is loaded, false otherwise.

  • Analyze a text for hate speech.

    Declaration

    Swift

    public func analyze(text: String) -> Double

    Parameters

    text

    The text to analyze.

    Return Value

    A hate speech score between 0 and 1.
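
    The documentation does not say how to interpret the score, so any cut-off is an application choice. A minimal sketch, assuming a hypothetical threshold of 0.8:

    let analyzer = HateSpeechAnalyzer.sharedInstance()
    if analyzer.isLoaded() {
        let score = analyzer.analyze(text: "some user-submitted text")
        // 0.8 is an arbitrary illustrative cut-off, not part of the API;
        // for a ready-made decision, use containsHateSpeech(text:) instead.
        if score > 0.8 {
            print("Text flagged with score \(score)")
        }
    }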

  • Analyze a text and report whether it contains hate speech.

    Declaration

    Swift

    public func containsHateSpeech(text: String) -> Bool

    Parameters

    text

    The text to analyze.

    Return Value

    true if the text contains hate speech.
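
    A sketch of filtering a batch of messages with the boolean check, assuming the model has already been initialised (the messages array is illustrative):

    let analyzer = HateSpeechAnalyzer.sharedInstance()
    let messages = ["first message", "second message"]
    if analyzer.isLoaded() {
        // Keep only the messages the model does not classify as hateful.
        let clean = messages.filter { !analyzer.containsHateSpeech(text: $0) }
        print("\(clean.count) of \(messages.count) messages passed")
    }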