Classes

The following classes are available globally.

  • Main entry point to the SDK. Initialize the SDK here before using any other component.

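    A minimal start-up sketch (hypothetical: this reference does not show the initialization call, so initialize() below is an assumed name rather than a documented API):

// Hypothetical: initialize the SDK once before using any analyzer.
// Substitute the actual entry point from the SDK documentation.
OwasNlpCore.initialize()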

    Declaration

    Swift

    public class OwasNlpCore
  • Class to extract the emotions from a text.

    It is a good idea to test that the model is loaded before calling analyze(). A simple use case to analyze a text for prevalent emotions would be:

func classifyText(text: String) -> EmotionResult? {
    // Make sure the model is loaded before analyzing.
    guard EmotionAnalyzer.sharedInstance().isLoaded() else { return nil }
    return EmotionAnalyzer.sharedInstance().analyze(text: text)
}
    

    Declaration

    Swift

    public class EmotionAnalyzer
  • Class to analyze a text for hate speech content.

    It is a good idea to test that the model is loaded before calling analyze(). A simple use case to check a text for hate speech would be:

func classifyText(text: String) -> Double? {
    // Make sure the model is loaded before analyzing.
    guard HateSpeechAnalyzer.sharedInstance().isLoaded() else { return nil }
    return HateSpeechAnalyzer.sharedInstance().analyze(text: text)
}
    

    Declaration

    Swift

    public class HateSpeechAnalyzer
  • Class to analyze a text for profanity.

    It is a good idea to test that the model is loaded before calling containsProfanity(). A simple use case to check a text for profanity would be:

func classifyText(text: String) -> Bool {
    // Make sure the model is loaded before checking for profanity.
    guard ProfanityAnalyzer.sharedInstance().isLoaded() else { return false }
    return ProfanityAnalyzer.sharedInstance().containsProfanity(text: text)
}
    

    Declaration

    Swift

    public class ProfanityAnalyzer
  • Class to analyze a text for sensitive information.

    It is a good idea to test that the model is loaded before calling analyse(). A simple use case to check a text for sensitive information would be:

func classifyText(text: String) -> Score? {
    // Make sure the model is loaded before analyzing.
    guard SensitiveInfoAnalyzer.sharedInstance().isLoaded() else { return nil }
    return SensitiveInfoAnalyzer.sharedInstance().analyse(text: text)
}
    

    Declaration

    Swift

    public class SensitiveInfoAnalyzer
  • Class to analyze a text for toxic content.

    It is a good idea to test that the model is loaded before calling analyze(). A simple use case to get the toxicity of a text would be:

func classifyText(text: String) -> Score? {
    // Make sure the model is loaded before analyzing.
    guard ToxicityAnalyzer.sharedInstance().isLoaded() else { return nil }
    return ToxicityAnalyzer.sharedInstance().analyze(text: text)
}
    

    Declaration

    Swift

    public class ToxicityAnalyzer
  • Representation of an emotion classification result. The supported emotions are:

    "anger", "fear", "joy", "love", "neutral", "sadness" and "surprise"
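    Since EmotionResult conforms to CustomStringConvertible and Codable (see the declaration below), a result can be printed and serialized with the standard library alone. A minimal sketch (logResult is an illustrative helper, not part of the SDK):

import Foundation

func logResult(_ result: EmotionResult) {
    // CustomStringConvertible provides a readable summary.
    print(result)
    // Codable allows JSON serialization, e.g. for logging or persistence.
    if let data = try? JSONEncoder().encode(result),
       let json = String(data: data, encoding: .utf8) {
        print(json)
    }
}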

    Declaration

    Swift

    public class EmotionResult : CustomStringConvertible, Codable
  • Result of an analysis by the SensitiveInfoAnalyzer.

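    As the declaration below shows, SensitiveInfoResult is Codable, so it survives a JSON round trip with the standard library alone. A minimal sketch (roundTrip is an illustrative helper, not part of the SDK):

import Foundation

func roundTrip(_ result: SensitiveInfoResult) -> SensitiveInfoResult? {
    // Encode to JSON and decode back, relying only on the declared Codable conformance.
    guard let data = try? JSONEncoder().encode(result) else { return nil }
    return try? JSONDecoder().decode(SensitiveInfoResult.self, from: data)
}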

    Declaration

    Swift

    public class SensitiveInfoResult : CustomStringConvertible, Codable
  • Result of an analysis by the TriggerWordsAnalyzer.

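    Because TriggerAnalysisResult is Codable (see the declaration below), a previously stored result can be decoded directly. A minimal sketch (decodeTriggerResult is an illustrative helper, not part of the SDK):

import Foundation

func decodeTriggerResult(from data: Data) -> TriggerAnalysisResult? {
    // Decode from JSON data, relying only on the declared Codable conformance.
    try? JSONDecoder().decode(TriggerAnalysisResult.self, from: data)
}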

    Declaration

    Swift

    public class TriggerAnalysisResult : CustomStringConvertible, Codable