Posts tagged medical
Ahhh… Nothing like the rush and hustle of a Spring semester to keep us from new pursuits and interests ;-) I do sincerely hope to keep a better, more regular pace with my entries than the one I’ve managed over the past two weeks. On to the idea…
A third idea for facilitating Health Sciences and Science studies via Second Life is one I’m labeling “Gulliver’s Anatomy.” The concept is simple, but the implementation will take a collaborative effort of quality builders and subject matter experts.
Essentially, if we can build buildings to walk through and tour in Second Life, can we not also build a virtual “person” through which we can walk and tour? I created a short video three weeks ago that I was going to use to demonstrate the concept, albeit VERY crudely. That may still be of use; however, the NMC Connect Arts Symposium (February 11-13, 2007) provided a much more thorough example of how this could be achieved.
The Penn State Virtual Worlds blog offered an entry regarding the NMC event and, in particular, “a community build” of a “‘giant’ on his back.” The image caught my attention.
Now… Imagine that the physical properties of the objects (prims) used to create this familiar giant were set to phantom, allowing SL residents to walk inside or through the body. Once inside the virtual “skin” of the giant, learners discover virtual anatomy: heart, liver, colon, stomach, muscle tissue, nerves, bones, ligaments, tendons, etc. With quality builders and subject matter experts, an expertly crafted model could be created. Such an expert model is not, in itself, a new capability enabled by SL.
With SL, however, each element of the anatomy may easily contain its own reference materials and/or be entirely interactive. Clicking on the heart offers the learner a notecard with details, facts, and links to web-based resources designed to support this virtual tour of the human body. Also, I’ve seen water slides in SL. Would it not be possible for younger learners, on the teen grid, to engage the concept of respiration by riding a blood cell as it travels through the blood vessels, from lung to heart to cell and back again? Exciting “stuff!”
I have several specific ideas for using the capabilities of Second Life to enhance the teaching and learning experience in Health Science learning spaces – virtual, online, hybrid or face-to-face. Some of those ideas focus on “micro-simulations” – scenarios that focus on very specific skills related to the health sciences.
One such micro-simulation – dubbed IV Starter for now – focuses on the skill of inserting a syringe into a vein for the purpose of starting an IV or drawing blood. It’s difficult to put all of the details down in text, so the video capture, while of a very rough, manually controlled prototype created by an instructional designer (not a proficient SL builder), demonstrates the concept.
Learners are presented with a model of a forearm and of a syringe or IV needle (please excuse any terminology errors – I’m not a subject matter expert). Given a proficient builder, scripter, avatar skin designer, and texture developer, the models could be quality representations of their RL counterparts. The forearm should include internal features as well: arteries, veins, and perhaps muscle, nerve, and bone tissue. The purpose of the activity is for learners to demonstrate the ability to manipulate the syringe at the appropriate angle and insertion point to properly access the vein.
The arm model could be created with varying settings to control the level of difficulty: transparency of the skin, size of the vein, “angle guides” for the syringe, and the level of distracting detail. The syringe could be scripted so that avatars may pick it up and control its orientation relative to the arm to ensure the correct angle. And, given a particular point on the vein and the point of the syringe, scripting should be able to evaluate the angle of the syringe in relation to the skin and vein at the moment the two collide.
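To make that angle check concrete, here is a minimal Python sketch of the scoring logic (the in-world version would be written in LSL; the function names and the 15–30 degree acceptance window are my own illustrative placeholders, not clinical guidance):

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    mag = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(dot / mag))

def evaluate_insertion(needle_dir, skin_normal, lo=15.0, hi=30.0):
    """At the moment of collision, check whether the needle meets the
    skin at an acceptable angle. The needle-to-skin angle is 90 degrees
    minus the needle's angle to the surface normal; the [lo, hi] window
    is an illustrative assumption."""
    needle_to_skin = abs(90.0 - angle_between(needle_dir, skin_normal))
    return lo <= needle_to_skin <= hi
```

In-world, the same arithmetic would run in the syringe’s collision handler, with the vein’s position and the “angle guide” settings feeding the `lo`/`hi` window.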
Further, given Second Life’s ability to transfer data to and from the web, learners could be asked to submit their names upon engaging the simulation, and their activities and attempts with the simulation tool could be recorded for assessment and re-training purposes. Ultimately, such an in-world simulation, particularly if it’s Open Content (as it should be), could provide an inexpensive tool to supplement other clinical preparation work. Once developed, the scripting logic could be reused for other health science simulations.
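As a sketch of what that record-keeping might look like, here is some illustrative Python (the real version would live in LSL and post to a web service, e.g. via llHTTPRequest; every field name here is a placeholder of my own):

```python
import json
import time

def record_attempt(log, learner_name, angle_degrees, success):
    """Collect one simulation attempt for assessment and re-training
    review. In-world, the returned JSON payload would be posted to a
    web endpoint; here it is simply appended to a local log."""
    entry = {
        "learner": learner_name,
        "angle_degrees": round(angle_degrees, 1),
        "success": success,
        "timestamp": int(time.time()),
    }
    log.append(entry)
    return json.dumps(entry)
```

A faculty member could then review each learner’s attempt history rather than only their final result.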
The first idea is Clinical Simulations using an interactive, case study data set “worn” by an avatar. It’s difficult to put all of the details down in text, so the video capture, while of a very rough, hard-coded prototype written by an instructional designer/technologist (not a subject matter expert), demonstrates the concept.
At least two people, using their respective avatars, participate in the role play simulation: one as the Nurse and the other as the Patient. The innovation with Second Life, however, comes as each avatar “wears” its role: a case study folder, dropped onto the avatar, draws on data and images stored on the web and/or in Second Life. When the Patient wears the case study, the shape and clothes of their avatar change to match the description of the patient in the study: relative height, weight, age, etc. The case study folder is programmed to provide the Patient with notes describing their symptoms, so they can offer specific details during the role play, and it also attaches interactive “clinical buttons” to the Patient.
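A wearable case study is really just structured data. This Python sketch shows one way it might be organized (the class and field names are hypothetical; in practice this would be an LSL folder pulling the data from the web):

```python
from dataclasses import dataclass, field

@dataclass
class CaseStudy:
    """One 'wearable' case study; all fields are illustrative."""
    case_number: str
    height_cm: int
    weight_kg: int
    age: int
    symptom_notes: list = field(default_factory=list)

def wear_case(case, avatar):
    """Reshape the Patient avatar to match the case description and
    hand the role-player their private symptom notes."""
    avatar["shape"] = {"height_cm": case.height_cm,
                      "weight_kg": case.weight_kg,
                      "age": case.age}
    avatar["notes"] = list(case.symptom_notes)
    return avatar
```

The same record would also seed the clinical buttons described below with case-appropriate results.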
Clinical Buttons, as I’m calling them, are small prims (basic Second Life objects) scripted to provide certain interactions and medical options when touched. For example, a right-arm clinical button may offer the Nurse several options when she touches (clicks on) it, as the video shows: check blood pressure, draw blood, give injection, or start an I.V. The button’s programming includes subsequent options and results as well, when appropriate: e.g., if drawing blood, for what purpose will it be tested? And, when tested, what are the results?
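That follow-up logic is naturally a small tree of choices. A hedged Python sketch (the option names and sample results are invented for illustration; in-world this would drive an llDialog-style menu on the button prim):

```python
# Each menu entry maps to either a final result or a follow-up menu.
RIGHT_ARM_BUTTON = {
    "check blood pressure": "120/80 (sample result)",
    "give injection": "injection administered",
    "start IV": "IV started",
    "draw blood": {
        "test for glucose": "glucose: 95 mg/dL (sample result)",
        "test for CBC": "CBC: within normal limits (sample result)",
    },
}

def touch(menu, *choices):
    """Follow the options the Nurse picks in sequence; return either
    the next set of options to present or a final result."""
    node = menu
    for choice in choices:
        node = node[choice]
    return sorted(node) if isinstance(node, dict) else node
```

Touching the button with no prior choices would present the top-level options; each subsequent click walks one level deeper until a result is reached.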
The optimal implementation is for the programming in each Clinical Button to be as generic as possible, pulling all menu options, choices, subsequent tests, and test results from an Open Content, web-based database. This could enable a range of very powerful features. First, professors would be able to create their own case studies and content by filling out web-based forms that interface with the database. Second, faculty would then be able to browse the database to identify specific case studies pertinent to their course content. Third, the case study folders could be programmed to listen for a case number once attached to an avatar, allowing learners to select specific, assigned case studies. Fourth, using images and sounds uploaded to the web or to SL, the simulations could return as results an image of a burn or the sound of lungs breathing rather than a textual description by the simulation or by the role-play participants.
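The “listen for a case number” step might reduce to a simple lookup against that database. A Python sketch with a stand-in dictionary (the case numbers and contents are invented; in-world this would be an LSL listen handler plus an HTTP fetch from the web-based database):

```python
# Stand-in for the Open Content, web-based case-study database.
CASE_DATABASE = {
    "C-101": "54-year-old patient presenting with chest pain",
    "C-202": "burn to the right forearm (image returned in-world)",
}

def on_chat(message):
    """Treat a chat message as a case number; load the case if it exists."""
    case_number = message.strip()
    summary = CASE_DATABASE.get(case_number)
    if summary is None:
        return "Case not found; please say an assigned case number."
    return f"Loaded case {case_number}: {summary}"
```

Because the buttons themselves stay generic, adding a new case study means adding a database row, not editing any in-world scripts.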
Certainly, I do not assume that this is a completely original idea on my part; my ego’s not that confident ;-) However, I have searched the web using a variety of keywords and phrases to find similar works in progress, but have not been able to locate this sort of clinical simulation. I did find a video from a 2005 presentation that demonstrates the clinical role-play possibilities, which is one element of the simulation I have in mind. Also, the SimTeach SLED discussion forum contained an October 2006 post describing a virtual hospital project that is/was looking for others working on medical simulations. And I encountered a YouTube video of a clinic build, along with numerous mentions of abnormal psychology simulations in Second Life. With that said, if you’re working on a project similar to this one (or any others I didn’t mention), I’d appreciate an email and a SLurl to the in-world location.