ToxicityAnalyzer
public class ToxicityAnalyzer
Class to analyze a text for toxic content.
It is a good idea to check that the model is loaded before calling analyze(text:). A simple use case to get the toxicity score of a text would be:
func classifyText(text: String) -> Double? {
    if ToxicityAnalyzer.sharedInstance().isLoaded() {
        return ToxicityAnalyzer.sharedInstance().analyze(text: text)
    }
    return nil
}
-
Class function to get the single instance of this class.
Declaration
Swift
public class func sharedInstance() -> ToxicityAnalyzer
Return Value
The singleton instance of the class.
-
Initialise the toxicity analyzer.
Declaration
Swift
public func initManager() throws
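Since initManager() can throw, it should be called with do/try/catch once before first use. A minimal sketch, assuming the analyzer is set up at app startup (the concrete errors thrown are not documented here, so the handler only logs them):

```swift
// Sketch: initialise the shared analyzer once at startup.
// The specific error types thrown by initManager() are an assumption
// not covered by this documentation; we simply log whatever is thrown.
func setUpToxicityAnalyzer() {
    do {
        try ToxicityAnalyzer.sharedInstance().initManager()
    } catch {
        // e.g. the underlying model could not be loaded
        print("ToxicityAnalyzer failed to initialise: \(error)")
    }
}
```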
-
Check whether the model has been initialised.
Declaration
Swift
public func isLoaded() -> Bool
Return Value
true if the model is loaded.
-
Analyze a text for toxicity.
Declaration
Swift
public func analyze(text: String) -> Double
Parameters
text
The text to analyze.
Return Value
A toxicity score between 0 and 1.
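The score can be mapped to a coarser label for display or filtering. A sketch, assuming the model is already loaded; the 0.3 and 0.8 thresholds are arbitrary illustrations, not part of the API:

```swift
// Illustrative only: the thresholds below are assumptions,
// not values defined by ToxicityAnalyzer.
func toxicityLabel(for text: String) -> String {
    // analyze(text:) returns a score between 0 and 1
    let score = ToxicityAnalyzer.sharedInstance().analyze(text: text)
    switch score {
    case ..<0.3: return "low"
    case ..<0.8: return "medium"
    default:     return "high"
    }
}
```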
-
Check whether a text contains toxic content.
Declaration
Swift
public func containsToxicity(text: String) -> Bool
Parameters
text
The text to analyze.
Return Value
true if the content is toxic.
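A typical use of containsToxicity(text:) is gating user-generated content before it is posted. A sketch, assuming initManager() has already been called successfully; whether to fail open or closed when the model is not ready is a product decision, not part of the API:

```swift
// Sketch: reject a comment if the analyzer classifies it as toxic.
// Assumes initManager() has already succeeded.
func canPost(comment: String) -> Bool {
    let analyzer = ToxicityAnalyzer.sharedInstance()
    guard analyzer.isLoaded() else {
        // Model not ready; this example fails closed (blocks the post).
        return false
    }
    return !analyzer.containsToxicity(text: comment)
}
```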