The Science

Artificial Intelligence Mental Evaluation

Step 1: Talk with AiME

AiME asks you questions and observes your responses over a 20-second conversation

Speech Content: your word choice & phrasing are analyzed for sentiment and meaning with an NLP neural network

Facial Expressivity: your facial cues and mannerisms are analyzed using a vision neural network

Vocal Prosody: your inflection, timbre, and pitch changes are analyzed with an acoustic analysis neural network

Step 2: AiME analyzes

AiME uses machine learning to analyze your responses

OBSERVATION

EXTRACT subtle verbal and nonverbal responses

IDENTIFY nuanced behavioral patterns over time

MACHINE LEARNING

APPLY neural network technology to observe responses

ANALYZE patterns to map various mental health risk factors

REPORTING

PRODUCE a report detailing mental health risk assessments

OFFER objective insights for tracking mental health over time

Step 3: Access your A.I. report 

AiME creates a private and secure report for you

Evaluation is summarized after each AiME exam

Gauges indicate behavior in sync with recorded webcam responses

Video recap for review

Risk of depression, anxiety, and addiction assessed by combining verbal and nonverbal data

Prediction Pulse Reading of risk levels in real time

Step 4: Track your mental health trends

Your AiME Dashboard lets you view trends and insights and track your growth

AiME (/ayy-mee/)

Validation

 

Textpert conducted a U.S. study with AiME to train her neural networks and validate the technology. The results were accepted for publication in the APA peer-reviewed journal Psychological Assessment.

AiME's ability to detect depression and addiction is the new gold standard

with 90% sensitivity and 90% specificity on blind-validated models

AiME can watch 20 seconds of extemporaneous conversation and assess risks of depression, anxiety, & addiction

1 Terabyte of proprietary data

750 Participants evaluated

Figure 1: Bing sentiment analysis

Bing sentiment analysis labels each word as positive or negative. People with higher depression risk respond with more "negative" words.
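As a rough illustration of how lexicon-based sentiment scoring works, the sketch below counts a transcript's positive and negative words against a tiny hand-picked word list. The word lists and function are illustrative assumptions only; the real Bing opinion lexicon contains thousands of entries, and this is not AiME's actual pipeline.

```python
# Minimal sketch of lexicon-based sentiment scoring (illustrative word lists,
# not the full Bing opinion lexicon and not AiME's pipeline).
POSITIVE = {"good", "happy", "calm", "great", "hopeful"}
NEGATIVE = {"bad", "sad", "tired", "worried", "hopeless"}

def sentiment_counts(transcript: str) -> dict:
    """Count positive and negative words and return a simple net score."""
    words = transcript.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return {"positive": pos, "negative": neg, "net": pos - neg}

print(sentiment_counts("I feel tired and worried, but hopeful about tomorrow."))
# -> {'positive': 1, 'negative': 2, 'net': -1}
```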

Figure 2: Word variance by PHQ-9

The plot displays the most significant words for each PHQ-9 score. Depressed people use more ordinary language and overuse the word "really".
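For a sense of how the "most significant words" per group can be surfaced, here is a minimal sketch that ranks words by how much more frequent they are in high-PHQ-9 transcripts than in low-PHQ-9 ones. The frequency-ratio ranking and toy example data are illustrative assumptions; the study behind this figure would use proper statistics on far more data.

```python
from collections import Counter

def distinctive_words(low_phq9_texts, high_phq9_texts, top_n=5):
    """Rank words by how much more often they appear in high-PHQ-9 transcripts.

    Simple frequency-ratio ranking for illustration only, not the study's
    actual significance test.
    """
    def relative_freqs(texts):
        counts = Counter(w for t in texts for w in t.lower().split())
        total = sum(counts.values()) or 1
        return {w: c / total for w, c in counts.items()}

    low = relative_freqs(low_phq9_texts)
    high = relative_freqs(high_phq9_texts)
    ratios = {w: f / (low.get(w, 0) + 1e-6) for w, f in high.items()}
    return sorted(ratios, key=ratios.get, reverse=True)[:top_n]

print(distinctive_words(
    ["today was pretty good", "i went out with friends"],
    ["i really cannot sleep", "i really feel tired all the time"],
))
```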

Figure 3: Audio encoding

The graph represents inflection frequencies from a single sentence, digitized and encoded for audio processing. AiME observes variations in inflection and analyzes the patterns.
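As a hedged sketch of what encoding vocal inflection can look like in practice, the snippet below tracks the pitch contour of a short recording with the open-source librosa library and summarizes its variability. The file name, sample rate, and feature choices are assumptions for illustration; AiME's actual acoustic features and models are not described here.

```python
import numpy as np
import librosa  # open-source audio analysis library (illustrative choice)

# Load ~20 seconds of speech; 'response.wav' is a placeholder filename.
y, sr = librosa.load("response.wav", sr=16000, duration=20.0)

# Track the fundamental frequency (pitch) frame by frame.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)

# Simple prosody summaries: how much, and how variably, the pitch moves.
voiced_f0 = f0[~np.isnan(f0)]
print("mean pitch (Hz):", voiced_f0.mean())
print("pitch variability (Hz):", voiced_f0.std())
```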

Figure 4: Word analysis

AiME analyzes overall sentiment, and each word is scored across 250 dimensions. AiME observes patterns and identifies the meaning of each statement.
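Scoring each word "across 250 dimensions" is how word-embedding models represent meaning. Below is a minimal sketch that averages per-word vectors into one vector per statement; the tiny vocabulary and randomly initialized 250-dimensional table are placeholders for a trained embedding model, not AiME's implementation.

```python
import numpy as np

DIM = 250  # per-word vector size mentioned in the caption
rng = np.random.default_rng(0)

# Stand-in embedding table; a real system would load trained vectors.
vocab = ["i", "feel", "really", "tired", "today"]
embeddings = {w: rng.normal(size=DIM) for w in vocab}

def statement_vector(statement: str) -> np.ndarray:
    """Average the vectors of known words to get one vector per statement."""
    vecs = [embeddings[w] for w in statement.lower().split() if w in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(DIM)

vec = statement_vector("I feel really tired today")
print(vec.shape)  # (250,)
```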

Figure 5: Ensemble neural network

AiME combines predictions using an ensemble neural network, similar in structure to the neural networks in the human brain.

Inputs from body language, tonality, & speech are analyzed, then processed through hidden layers & nodes. AiME identifies the patterns, then makes predictions about your mental health.
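To make the fusion idea concrete, here is a minimal forward pass that concatenates face, voice, and speech feature vectors, runs them through one hidden layer, and emits three risk scores. The layer sizes, random weights, and sigmoid outputs are illustrative assumptions, not AiME's ensemble architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Placeholder feature vectors from the three modalities.
face_features = rng.normal(size=32)     # e.g. facial-expression features
voice_features = rng.normal(size=16)    # e.g. prosody features
speech_features = rng.normal(size=250)  # e.g. averaged word embeddings

x = np.concatenate([face_features, voice_features, speech_features])

# One hidden layer and an output layer with three risk scores
# (depression, anxiety, addiction). Weights are random placeholders;
# a real model would learn them from labeled data.
W1, b1 = rng.normal(size=(64, x.size)) * 0.1, np.zeros(64)
W2, b2 = rng.normal(size=(3, 64)) * 0.1, np.zeros(3)

hidden = np.tanh(W1 @ x + b1)
risk_scores = sigmoid(W2 @ hidden + b2)
print(dict(zip(["depression", "anxiety", "addiction"], risk_scores.round(3))))
```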

Read all that? You must like this stuff as much as we do. Send us a note at [email protected] with the phrase "create calm" and get an extra month free.
