PCS Spark can deliver a highly dynamic learning experience, presenting various avatars, locations, facial expressions, verbal responses and physical exams. These are all driven by patient scenario content you can fully control with the PCS content authoring tools. In this article, we’ll describe how to set up your content to customize the above elements of the PCS Spark experience.

Avatar & Environment

You do not select a patient avatar directly for your scenario; instead, the presented avatar is determined by the Age and Gender settings of the scenario. Set these fields as seen below to get the corresponding avatar.
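As a rough illustration of this selection rule, the sketch below maps Age and Gender settings to an avatar identifier. The age bands and avatar names here are invented for illustration; the real mapping is defined inside PCS Spark.

```python
# Illustrative sketch only: the age bands and avatar names below are
# hypothetical placeholders, not the actual PCS Spark mapping.

def select_avatar(age: int, gender: str) -> str:
    """Derive an avatar identifier from the scenario's Age and Gender settings."""
    band = "child" if age < 13 else "adult" if age < 65 else "senior"
    return f"{band}_{gender.lower()}"

print(select_avatar(34, "Female"))  # adult_female
```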

The location of the interview is determined by the content of the Patient Description field: the first keyword found in the description selects the corresponding location.
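A minimal sketch of this "first keyword wins" matching, assuming the rule means the keyword that occurs earliest in the description. The keyword-to-location pairs are invented for illustration; the actual list ships with PCS Spark.

```python
# Hypothetical keyword-to-location pairs for illustration only;
# the real set is defined by PCS Spark.
LOCATION_KEYWORDS = {
    "emergency": "Emergency Room",
    "clinic": "Outpatient Clinic",
    "ward": "Hospital Ward",
}

def pick_location(patient_description):
    """Return the location whose keyword appears earliest in the description."""
    text = patient_description.lower()
    hits = [(text.find(kw), loc) for kw, loc in LOCATION_KEYWORDS.items() if kw in text]
    return min(hits)[1] if hits else None

print(pick_location("Arrived at the clinic after an emergency call"))
```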


When you are ready to converse with your patient, make them start listening: either click the main controller button or call out their first name (e.g. “Hello Vanessa”) if Wake on Name is enabled for your virtual simulator in Simulator Settings.

When you are done conversing with your patient, make them stop listening with another click of the main controller button, or by saying something similar to “Thank you, take care”, “Bye for now”, or “See you later”.

Voice Memo: press and hold the main controller button to record a brief voice memo. Voice memos are transcribed and saved to the log.


Conversation responses can include the following placeholder tags, which are replaced with the corresponding value when spoken:

Type in the response:    Will say:
<year>                   Current year
<name> / <fullname>      Patient full name, e.g. “Valerie Miller”
<firstname>              Patient first name, e.g. “Valerie”
<lastname>               Patient last name, e.g. “Miller”
<age>                    Patient age from Patient Editor - Basics
<birthday>               “April 11”
<dob>                    “April 11” and the year of birth calculated from the age field
<date>                   Current date, e.g. “28th”
<month>                  Current month, e.g. “March”
<day>                    Current day, e.g. “Tuesday”
<season>                 Current season, e.g. “Summer”
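To show how this kind of tag expansion works, here is a minimal sketch covering a few of the tags above. The function name, the value formatting, and the handling of dates are assumptions for illustration; PCS Spark performs the real expansion internally.

```python
# Hedged sketch of placeholder expansion; not the PCS Spark implementation.
from datetime import date

def expand(response, first, last, today=None):
    """Replace a subset of the placeholder tags with concrete values."""
    today = today or date.today()
    values = {
        "<year>": str(today.year),
        "<name>": f"{first} {last}",
        "<fullname>": f"{first} {last}",
        "<firstname>": first,
        "<lastname>": last,
        "<month>": today.strftime("%B"),  # e.g. "March"
        "<day>": today.strftime("%A"),    # e.g. "Tuesday"
    }
    for tag, value in values.items():
        response = response.replace(tag, value)
    return response

print(expand("Hi, I am <firstname> <lastname>.", "Valerie", "Miller"))
# Hi, I am Valerie Miller.
```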

Emotions and Gestures

The avatar keeps eye contact while conversing, and occasionally looks around, folds its arms, or crosses its legs.

Add emotions and gestures to conversation responses by typing any of the following facial expression or gesture tags:

  • <normal>  <scared>  <angry>  <surprised>  <worried>  <exhausted>  <happy> <serious>  <sad>
  • <nod>  <headshake>
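As an illustration of the authoring pattern, an authored response can mix spoken text with these tags. The sketch below separates them; the function and its tag handling are assumptions, not the PCS Spark implementation.

```python
# Hedged sketch: splitting an authored response into spoken text and
# expression/gesture tags. PCS Spark's internal handling may differ.
import re

TAGS = {"normal", "scared", "angry", "surprised", "worried", "exhausted",
        "happy", "serious", "sad", "nod", "headshake"}

def split_response(response):
    """Return (spoken_text, list_of_tags) for an authored response string."""
    tags = [t for t in re.findall(r"<(\w+)>", response) if t in TAGS]
    text = re.sub(r"<\w+>", "", response).strip()
    return text, tags

print(split_response("<happy> Good morning, doctor! <nod>"))
```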

Physical Exam

When in VR, transition to the physical exam to check the patient’s pulses and practice auscultation by using a verbal request similar to any of the following phrases:

  • Let’s start your physical exam! 
  • Can I check your eye?
  • Can I take your pulse?
  • Let me listen to your heart!
  • Can I have a listen to your chest?
  • Is it okay if I take your pulse?

While the patient is sitting on the exam bed for the physical exam, a virtual stethoscope appears when you move the controller. Auscultation sounds play when you point the controller at the correct position and pull the trigger.

Pulse check provides audio feedback on all headsets and haptic feedback when using Oculus Touch controllers.

When you enter the physical exam, the ongoing conversation is suspended. However, you can restart listening at any time with either of the usual methods: calling the patient’s first name or, more conveniently, clicking the controller button. To end the physical exam, restart listening and use a phrase similar to:

  • Let’s get back to the chair! 
  • We’re done here.