SDK integration

Install CocoaPods

You need CocoaPods to install the SDK. If you don't have it yet, install it with `sudo gem install cocoapods`, or follow the CocoaPods installation instructions.

Add dependencies to Podfile

In the Podfile, add the following to any target that uses the text analysis SDK.

pod 'OwasNlp', '~> 0.1.0'
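For context, a minimal Podfile could look like the following sketch. The target name 'MyApp' and the platform version are placeholders; check the SDK's actual requirements.

```ruby
# Podfile — 'MyApp' and the iOS version are placeholders
platform :ios, '13.0'

target 'MyApp' do
  use_frameworks!
  pod 'OwasNlp', '~> 0.1.0'
end
```

Run `pod install` afterwards and open the generated .xcworkspace rather than the .xcodeproj.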

Perform a text analysis

Make sure you have an API key and secret

If you haven't created an account yet, visit Privately's developer website to register for a free trial.

Authenticate the core SDK

if !PrivatelyCore.sharedInstance().isAuthenticated() {
    PrivatelyCore.sharedInstance().authenticate(apiKey: apiKey, apiSecret: apiSecret, callback: { result in
        // Handle success/failure
    })
}
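Putting it together, a typical launch sequence authenticates first and only initialises the analyzers once authentication completes. This is a sketch: the exact shape of the callback's result is an assumption, so adapt the handling to your SDK version.

```swift
// Sketch — the callback's result type is an assumption; check your SDK version.
func setUpPrivatelySDK(apiKey: String, apiSecret: String) {
    guard !PrivatelyCore.sharedInstance().isAuthenticated() else { return }

    PrivatelyCore.sharedInstance().authenticate(apiKey: apiKey, apiSecret: apiSecret, callback: { result in
        // On success, initialise the analyzers you plan to use.
        do {
            try HateSpeechAnalyzer.sharedInstance().initManager()
            try ToxicityAnalyzer.sharedInstance().initManager()
        } catch {
            print("Analyzer initialisation failed: \(error)")
        }
    })
}
```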

Analyze a text for hate speech

Hate speech analysis is done through the HateSpeechAnalyzer class. First, make sure the HateSpeechAnalyzer is initialised:

do {
    try HateSpeechAnalyzer.sharedInstance().initManager()
} catch {
    print("Failed to initialise HateSpeechAnalyzer: \(error)")
}

To find out if a text contains hate speech, use the following call:

let containsHateSpeech = HateSpeechAnalyzer.sharedInstance().containsHateSpeech(text: text)

You can also get the underlying hate speech score (between 0 and 1) associated with the text by using:

let hateSpeechResult = HateSpeechAnalyzer.sharedInstance().analyze(text: text)
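If you want to apply your own threshold instead of relying on the SDK's built-in decision, you can compare the score yourself. Note that the `score` property name below is hypothetical — check how your SDK version exposes the 0–1 value:

```swift
// Hypothetical: the `score` property name is an assumption.
let hateSpeechResult = HateSpeechAnalyzer.sharedInstance().analyze(text: text)
let strictThreshold = 0.3  // flag more aggressively than a default of 0.5
if hateSpeechResult.score > strictThreshold {
    // e.g. hide the message pending moderation
}
```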

Analyze a text for toxicity

Toxicity analysis is done through the ToxicityAnalyzer class. First, make sure the ToxicityAnalyzer is initialised:

do {
    try ToxicityAnalyzer.sharedInstance().initManager()
} catch {
    print("Failed to initialise ToxicityAnalyzer: \(error)")
}

To find out if a text contains toxicity, use the following call:

let containsToxicity = ToxicityAnalyzer.sharedInstance().containsToxicity(text: text)

You can also get the underlying toxicity score (between 0 and 1) associated with the text by using:

let toxicityResult = ToxicityAnalyzer.sharedInstance().analyze(text: text)

Analyze a text for profanity

To find out if a text contains profanity, use the following call:

let containsProfanity = ProfanityAnalyzer.sharedInstance().containsProfanity(text: text)
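A common use is gating user input before it is posted. The helper below is a sketch, not part of the SDK:

```swift
// Sketch — a hypothetical helper wrapping the SDK call.
func canPost(_ message: String) -> Bool {
    // Reject messages the SDK flags as profane.
    return !ProfanityAnalyzer.sharedInstance().containsProfanity(text: message)
}
```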

Analyze a text for sensitive information

Sensitive information analysis is done through the SensitiveInfoAnalyzer class. First, make sure the SensitiveInfoAnalyzer is initialised:

SensitiveInfoAnalyzer.sharedInstance().initManager()

To find out if a text contains sensitive information (a UK postcode, UK phone number, or email address), use the following call:

let sensitiveInfos = SensitiveInfoAnalyzer.sharedInstance().analyse(text: text)

You can then query the returned SensitiveInfoResult with containsUkPostcode(), containsUkPhoneNumber(), and containsEmailAddress(). For more flexibility, you can also use contains(sensitiveInfo: SensitiveInfoMatch), containsAll(sensitiveInfo: Sequence), or the asArray() method.
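For example, to warn a user before they share personal details (method names are taken from the list above; the exact return types are assumptions):

```swift
let result = SensitiveInfoAnalyzer.sharedInstance().analyse(text: text)

if result.containsEmailAddress() || result.containsUkPhoneNumber() {
    // Warn the user before they share contact details.
}

// Or inspect every match at once — assumes asArray() yields the individual matches.
for match in result.asArray() {
    print("Found sensitive info: \(match)")
}
```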