ToxicityAnalyzer
public class ToxicityAnalyzer
Class to analyze a text for toxic content.
It is a good idea to test that the model is loaded before calling analyze(text:). A simple use case that returns the toxicity of a text:
func classifyText(text: String) -> Double? {
    if ToxicityAnalyzer.sharedInstance().isLoaded() {
        return ToxicityAnalyzer.sharedInstance().analyze(text: text)
    }
    return nil
}
-
Class function to get the single instance of this class.
Declaration
Swift
public class func sharedInstance() -> ToxicityAnalyzer
Return Value
The singleton instance of the class.
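All API calls go through this shared instance; a minimal sketch (the analyzer constant name is illustrative):

let analyzer = ToxicityAnalyzer.sharedInstance()
// The same singleton instance is returned on every call.
let isSameInstance = analyzer === ToxicityAnalyzer.sharedInstance()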
-
Initialise the toxicity analyzer.
Declaration
Swift
public func initManager() throws
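Because initManager() can throw, call it inside a do-catch block before analyzing any text. A minimal sketch, assuming the error is simply logged:

do {
    // Load the underlying model once, before any calls to analyze(text:).
    try ToxicityAnalyzer.sharedInstance().initManager()
} catch {
    // Illustrative handling: log the failure and leave the model unloaded.
    print("ToxicityAnalyzer failed to initialise: \(error)")
}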
-
Check whether the model has been initialised.
Declaration
Swift
public func isLoaded() -> Bool
Return Value
true if the model is loaded.
-
Analyze a text for toxicity.
Declaration
Swift
public func analyze(text: String) -> Double
Parameters
text
The text to analyze.
Return Value
A toxicity score between 0 and 1.
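A sketch that converts the score to a percentage for display; the function name toxicityPercentage(for:) and the percentage formatting are illustrative, not part of the API:

func toxicityPercentage(for text: String) -> Int? {
    let analyzer = ToxicityAnalyzer.sharedInstance()
    // Only analyze once the model has been loaded.
    guard analyzer.isLoaded() else { return nil }
    // analyze(text:) returns a score between 0 and 1.
    return Int(analyzer.analyze(text: text) * 100)
}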
-
Check whether a text contains toxic content.
Declaration
Swift
public func containsToxicity(text: String) -> Bool
Parameters
text
The text to analyze.
Return Value
true if the content is toxic.
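A sketch of using containsToxicity(text:) to filter messages; the function name shouldBlock(message:) and the fallback behaviour when the model is not loaded are assumptions:

func shouldBlock(message: String) -> Bool {
    let analyzer = ToxicityAnalyzer.sharedInstance()
    // Illustrative fallback: allow the message if the model is not loaded.
    guard analyzer.isLoaded() else { return false }
    return analyzer.containsToxicity(text: message)
}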