Classes
The following classes are available globally.
-
Main entry point to the SDK. The SDK must be initialized here before any other component is used.
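A minimal sketch of initializing the SDK at app launch, before any analyzer is used. The sharedInstance() and initialize() calls are assumptions modelled on the analyzer classes below; consult the OwasNlpCore class reference for the actual entry point.
    // Sketch only: sharedInstance() and initialize() are assumed names,
    // modelled on the analyzer classes in this SDK. Check the OwasNlpCore
    // reference for the actual initialization API.
    func setUpNlpSdk() {
        OwasNlpCore.sharedInstance().initialize()
    }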
Declaration
Swift
public class OwasNlpCore
-
Class to extract the emotions from a text.
It is a good idea to test that the model is loaded before calling analyze(). A simple use case to analyze a text for prevalent emotions would be:
    func classifyText(text: String) -> EmotionResult? {
        if EmotionAnalyzer.sharedInstance().isLoaded() {
            return EmotionAnalyzer.sharedInstance().analyze(text: text)
        }
        return nil
    }
Declaration
Swift
public class EmotionAnalyzer
-
Class to analyze a text for hate speech content.
It is a good idea to test that the model is loaded before calling analyze(). A simple use case to test a text for hate speech would be:
    func classifyText(text: String) -> Double? {
        if HateSpeechAnalyzer.sharedInstance().isLoaded() {
            return HateSpeechAnalyzer.sharedInstance().analyze(text: text)
        }
        return nil
    }
Declaration
Swift
public class HateSpeechAnalyzer
-
Class to analyze a text for profanity.
It is a good idea to test that the model is loaded before calling containsProfanity(). A simple use case to test a text for profanity would be:
    func classifyText(text: String) -> Bool {
        if ProfanityAnalyzer.sharedInstance().isLoaded() {
            return ProfanityAnalyzer.sharedInstance().containsProfanity(text: text)
        }
        return false
    }
Declaration
Swift
public class ProfanityAnalyzer
-
Class to analyze a text for sensitive information.
It is a good idea to test that the model is loaded before calling analyse(). A simple use case to test a text for sensitive information would be:
    func classifyText(text: String) -> Score? {
        if SensitiveInfoAnalyzer.sharedInstance().isLoaded() {
            return SensitiveInfoAnalyzer.sharedInstance().analyse(text: text)
        }
        return nil
    }
Declaration
Swift
public class SensitiveInfoAnalyzer
-
Class to analyze a text for toxic content.
It is a good idea to test that the model is loaded before calling analyze(). A simple use case to get the toxicity of a text would be:
    func classifyText(text: String) -> Score? {
        if ToxicityAnalyzer.sharedInstance().isLoaded() {
            return ToxicityAnalyzer.sharedInstance().analyze(text: text)
        }
        return nil
    }
Declaration
Swift
public class ToxicityAnalyzer
-
Representation of an emotion classification result. The supported emotions are:
"anger", "fear", "joy", "love", "neutral", "sadness" and "surprise"
Declaration
Swift
public class EmotionResult : CustomStringConvertible, Codable
-
Result of an analysis by the SensitiveInfoAnalyzer.
Declaration
Swift
public class SensitiveInfoResult : CustomStringConvertible, Codable
-
Result of an analysis by the TriggerWordsAnalyser.
Declaration
Swift
public class TriggerAnalysisResult : CustomStringConvertible, Codable