Get session data
Provides session state and insights as they become available.
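For orientation, here is a minimal TypeScript sketch of calling this endpoint. The base URL, the /sessions/{sessionId} path and the X-Api-Key header name are illustrative assumptions, not confirmed by this reference; substitute the values from your actual integration.

```ts
// Minimal sketch: fetch session data for a known sessionId.
// BASE_URL, the /sessions/{sessionId} path and the X-Api-Key header
// are illustrative assumptions, not confirmed by this reference.
const BASE_URL = "https://api.example.com";
const API_KEY = process.env.API_KEY ?? "";

export async function getSessionData(sessionId: string): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/sessions/${encodeURIComponent(sessionId)}`, {
    headers: { "X-Api-Key": API_KEY, Accept: "application/json" },
  });
  if (res.status === 403) throw new Error("Invalid API Key");
  if (res.status === 404) throw new Error(`Session ${sessionId} was not found`);
  if (!res.ok) throw new Error(`Unexpected status ${res.status}`);
  return res.json(); // shape documented in the 200 schema below
}
```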
Path Parameters
- sessionId string required
Id of the session. A session is created when all participants (provider and clients) have joined and the provider starts the session. The session id is part of the webhook payload for all session events, e.g. when the session is created.
- 200
- 403
- 404
OK
- application/json
Schema
- sessionId string required
Id of the session.
- noteTemplateIds string[] required
List of note template ids used to generate notes for the session.
- date string
Date and time when the session was created, in ISO 8601 format.
- providerUserId string required
Id of the provider user.
- clientUserIds string[] required
Ids of the client users that participated in the session.
- state TherapySessionStateResponse required
State of the session. Possible values: [InProgress, Processing, Done, Error]
- error object
  - code string required
  Error code identifying the problem. See Error codes.
  - message string required
  Error message.
- recordingDurationMs integer
Duration of the session recording in milliseconds. Value becomes available during session processing.
- recordingUrl uri
URL to the session recording. The URL is valid only for a limited time.
- transcript object
Transcript of the recorded session. Value becomes available during session processing.
  - utterances object[]
  List of all utterances.
    Array [
    - text string required
    Text of the utterance.
    - start integer required
    Relative start time in milliseconds of this utterance in the recording.
    - end integer required
    Relative end time in milliseconds of this utterance in the recording.
    - speaker string required
    Speaker of the utterance. Speakers are labeled with letters, starting with 'A'.
    - words object[] required
    List of words in the utterance.
      Array [
      - text string required
      Text of the single word.
      - start integer required
      Relative start time [ms] of this word occurrence in the recording.
      - end integer required
      Relative end time [ms] of this word occurrence in the recording.
      ]
    ]
  - speakerMapping object
  Mapping of speakers (present in utterances) to user identifiers. One user identifier can be assigned to multiple speakers.
    - property name* string
- topics object[]
List of topics discussed in the session.
  Array [
  - headline string required
  Topic headline.
  - gist string required
  Short gist of the topic.
  - start integer required
  Relative start time [ms] of this topic in the recording.
  - end integer required
  Relative end time [ms] of this topic in the recording.
  ]
- namedEntities object[]
List of named entities detected in the transcript, with their descriptions.
  Array [
  - entityType string required
  Type of detected entity, e.g. "medical_condition", "organisation", "location", "person_name".
  - name string required
  Name of the entity.
  - summary string
  Summarized description of the entity.
  ]
- notes object[]
List of session notes generated for the session based on the requested template ids.
  Array [
  - sessionDocumentId string required
  Unique identifier of the session document.
  - noteTemplateId string required
  Identifier of the note template used to generate this document.
  - notes object[] required
  List of progress notes generated for this session document.
    Array [
    - category string required
    Progress note category, for example "ClientPresentation", "TherapeuticIntervention", "Plan", ...
    - categoryName string
    Human-readable name for the progress note category.
    - text string required
    Progress note text.
    - formatting object
      - nestingLevel integer
      Nesting level of the note in the context of all notes in its category. Notes with level > 0 can be treated as bulleted list items.
    ]
  ]
- analyticsSummary object
The analytics summary is intended to show the session's talking ratio, speech cadence and response time at a glance.
  - talkingRatio object
  The talking time divided by the session time. In other words, the speaking time of the client versus the healing professional during a single session.
    - property name* object (TalkingRatioValue)
      - ratio float
      - ratioLongTerm float
      - duration float
      - durationLongTerm float
  - speechCadence object
  Number of words per minute of talking time. Speech cadence measures both the client's and the therapist's words per minute across time, i.e. over the course of a session.
    - property name* object (SpeechCadenceValue)
      - cadence float
      - cadenceLongTerm float
  - responseTime object
  The average time (in seconds) that it takes a client to respond to a healing professional's question or statement, and vice versa. In other words, this measures the time to reply between therapist and client.
    - property name* object (ResponseTimeValue)
      - time float
      - timeLongTerm float
  - sentiment object
  Sentiment of participants. We identify the number of sentences with Positive, Neutral or Negative sentiment using a polarity value. The aim of polarity detection is to find out whether the opinion expressed in a text is positive or negative in how it relates to the topics discussed.
    - property name* object (SentimentValue)
      - positive float
      - positiveLongTerm float
      - negative float
      - negativeLongTerm float
  - timePerspective object
  This measure identifies the number of words that are verbs in the Past, Present, or Future tense. Tense analysis is helpful for identifying where a client is stuck in their thinking, especially in connection to the topics discussed.
    - property name* object (TimePerspectiveValue)
      - future float
      - futureLongTerm float
      - past float
      - pastLongTerm float
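Several of the fields above (recordingDurationMs, transcript, notes, analyticsSummary) only become available once the session has been processed, so a consumer typically waits for state to reach Done before reading them. Below is a small polling sketch that reuses the hypothetical getSessionData helper from the request example above; the interval and retry limit are arbitrary choices, and the webhook session events may be a better fit than polling.

```ts
// Poll until the session leaves the InProgress/Processing states.
// Sketch only: interval and attempt limits are arbitrary assumptions.
type SessionState = "InProgress" | "Processing" | "Done" | "Error";

interface SessionData {
  sessionId: string;
  state: SessionState;
  error?: { code: string; message: string };
  // ...remaining fields as described in the schema above
}

async function waitUntilProcessed(
  sessionId: string,
  intervalMs = 30_000,
  maxAttempts = 60,
): Promise<SessionData> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const session = (await getSessionData(sessionId)) as SessionData;
    if (session.state === "Done") return session;
    if (session.state === "Error") {
      throw new Error(`Session failed: ${session.error?.code ?? "unknown"}`);
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Session ${sessionId} did not finish within the polling window`);
}
```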
Example (from schema)
{
"sessionId": "string",
"noteTemplateIds": [
"2fb81b0a-611b-11ed-9b6a-0242ac120002"
],
"date": "2024-01-01T12:00:00Z",
"providerUserId": "string",
"clientUserIds": [
"string"
],
"state": "InProgress",
"error": {
"code": "low-content-transcript",
"message": "We're sorry, we couldn't interpret enough audio to generate notes. This can happen when the audio quality is too low, because of a technical issue, or if the recording is too short."
},
"recordingDurationMs": 0,
"recordingUrl": "string",
"transcript": {
"utterances": [
{
"text": "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.",
"start": 5000,
"end": 8000,
"speaker": "B",
"words": [
{
"text": "Lorem",
"start": 5000,
"end": 5907
}
]
}
],
"speakerMapping": {
"A": "d290f1ee-6c54-4b01-90e6-d701748f0851",
"B": "15fadcc2-9766-40f3-9826-4cbf00f26fa4"
}
},
"topics": [
{
"headline": "The client is new to therapy and finding it interesting but not yet comfortable, but the therapist reassures them that it takes time to adjust and most people find it a safe and comfortable place to talk, while also explaining that therapy is unique in that it's two people focusing on one person.",
"gist": "Adjusting to Therapy Process.",
"start": 250,
"end": 32200
}
],
"namedEntities": [
{
"entityType": "person_name",
"name": "Amelia",
"summary": "Amelia was Tony's patient, a 35 year old woman who died on the table."
}
],
"notes": [
{
"sessionDocumentId": "2fb81b0a-611b-11ed-9b6a-0242ac120002",
"noteTemplateId": "2fb81b0a-611b-11ed-9b6a-0242ac120002",
"notes": [
{
"category": "ClientPresentation",
"categoryName": "Client Presentation",
"text": "The client is hoping to find a balance between work and life in order to improve his relationships.",
"formatting": {
"nestingLevel": 0
}
}
]
}
],
"analyticsSummary": {
"talkingRatio": {
"b3a95591-3458-4878-9637-03e70edbc9af": {
"ratio": 0.25592834,
"ratioLongTerm": 0.25724095,
"duration": 19146,
"durationLongTerm": 19224
},
"d9f78f1f-d4d9-4fb5-a5f4-e566d9478a6b": {
"ratio": 0.66159606,
"ratioLongTerm": 0.6583592,
"duration": 49494,
"durationLongTerm": 49202.453
}
},
"speechCadence": {
"b3a95591-3458-4878-9637-03e70edbc9af": {
"cadence": 150.42307,
"cadenceLongTerm": 151.49664
},
"d9f78f1f-d4d9-4fb5-a5f4-e566d9478a6b": {
"cadence": 184.26476,
"cadenceLongTerm": 185.4981
}
},
"responseTime": {
"b3a95591-3458-4878-9637-03e70edbc9af": {
"time": 788,
"timeLongTerm": 898.84845
},
"d9f78f1f-d4d9-4fb5-a5f4-e566d9478a6b": {
"time": 1006,
"timeLongTerm": 1124.7727
}
},
"sentiment": {
"b3a95591-3458-4878-9637-03e70edbc9af": {
"positive": 0.6666667,
"positiveLongTerm": 0.6298702,
"negative": 0,
"negativeLongTerm": 0
}
},
"timePerspective": {
"b3a95591-3458-4878-9637-03e70edbc9af": {
"future": 0,
"futureLongTerm": 0,
"past": 0.16666667,
"pastLongTerm": 0.14935066
}
}
}
}
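The example above shows how speaker labels in the transcript's utterances relate to user ids via speakerMapping. Here is a sketch of resolving each utterance to a user id and summing talking time per user; the type names are illustrative, not part of the API.

```ts
// Resolve transcript speaker labels ("A", "B", ...) to user ids via
// speakerMapping and sum up talking time per user. Types are illustrative.
interface Utterance {
  text: string;
  start: number; // ms, relative to recording start
  end: number;   // ms
  speaker: string;
}

interface Transcript {
  utterances: Utterance[];
  speakerMapping?: Record<string, string>; // speaker label -> user id
}

function talkingTimeByUser(transcript: Transcript): Map<string, number> {
  const totals = new Map<string, number>();
  for (const u of transcript.utterances) {
    // Fall back to the raw speaker label if no mapping entry exists.
    const userId = transcript.speakerMapping?.[u.speaker] ?? u.speaker;
    totals.set(userId, (totals.get(userId) ?? 0) + (u.end - u.start));
  }
  return totals;
}

// With the example payload above, speaker "B" resolves to
// "15fadcc2-9766-40f3-9826-4cbf00f26fa4" with 3000 ms of talking time.
```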
403
Invalid API Key
404
Record with the given ID wasn't found.
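Finally, the formatting.nestingLevel field in generated notes allows a session document to be rendered with nested bullet points within each category. A rendering sketch follows; the types mirror the schema above, while the grouping and plain-text layout are just one illustrative option.

```ts
// Render a session document's notes grouped by category, using
// formatting.nestingLevel to indent notes as bulleted sub-items.
// Assumes notes arrive grouped by category, as in the example payload.
interface ProgressNote {
  category: string;
  categoryName?: string;
  text: string;
  formatting?: { nestingLevel?: number };
}

function renderNotes(notes: ProgressNote[]): string {
  const lines: string[] = [];
  let currentCategory: string | undefined;
  for (const note of notes) {
    if (note.category !== currentCategory) {
      currentCategory = note.category;
      lines.push(note.categoryName ?? note.category);
    }
    const level = note.formatting?.nestingLevel ?? 0;
    // Notes with level > 0 are treated as bulleted list items.
    lines.push(`${"  ".repeat(level)}${level > 0 ? "- " : ""}${note.text}`);
  }
  return lines.join("\n");
}
```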