David J. Zielinski

AR/VR Technology Specialist (Analyst, IT)
Office of Information Technology (OIT)
Duke University

Email: djzielin AT duke.edu
Google Scholar page: here
LinkedIn page: here

David J. Zielinski is currently an AR/VR technology specialist for the Duke University OIT Co-Lab (2021-present). Previously he worked in the Duke Department of Art, Art History & Visual Studies (2018-2020) and the Duke DiVE Virtual Reality Lab (video) (2004-2018), under the direction of Regis Kopper (2013-2018), Ryan P. McMahan (2012), and Rachael Brady (2004-2012). He received his bachelor's (2002) and master's (2004) degrees in Computer Science from the University of Illinois at Urbana-Champaign, where he worked on a suite of virtual reality musical instruments (video) under the guidance of Bill Sherman. He is a VR/AR software developer, researcher, mentor, and educator. His current research interests include virtual musical instruments, social VR, and digital cartography. He has additional interests in audio synthesis, audio signal processing, and physical/embedded computing.

Tutorials (Fall 2024)
Co-Lab Introduction to Virtual Reality Instructor 1-hour workshop
Co-Lab Build a Mobile Game (Introduction to Unity) Instructor 4 x 1-hour workshops
Co-Lab Next-Gen Music Making: Crafting the Future of Sound in VR Instructor 1-hour workshop
Co-Lab Beginner's Guide to Arduino: Controlling the Physical World with Code Instructor 2-hour workshop


Mentorship
Bass Neurocities and Ruinscapes: Reconstructing Ancient Cities and Ruins Using Virtual Reality. link Staff Contributor 2023-2024
Code+ "Frictionless" AR: A Museum Exhibit Toolkit. link Project Co-Lead 2021 Summer
Code+ Sandcastle: A Historical City-Builder for Humanities Scholars. link Project Co-Lead 2021 Summer
Bass Virtual Reality and Neuroarchaeology. link Staff Contributor 2020-2021
Bass Smart Archaeology. link Staff Contributor 2019-2020

Courses
ISS 270, ISS 190 Constructing Immersive Virtual Worlds Student project support (Unity + XR Interaction Toolkit) 2024 Fall, 2022 Fall, 2022 Spring
SCISOC 197 From Siri to Skynet: Our Complex Relationships with Technology Experience VR Fieldtrip (MPS Studio Visit) 2024 Fall, 2023 Fall, 2022 Fall
COMPSCI 290 Educational Technology Guest Lecture and MPS Studio Visit 2024 Spring
CMAC 660 Games, Play, and Selfhood: Immersive Media and Extended Realities Experience XR Fieldtrip (MPS Studio Visit) 2024 Spring
ISS 305 Virtual Museums: Theories and Methods of Twenty-First-Century Museums Guest lecturer and student project support (Unity) 2024 Spring, 2023 Spring, 2020 Fall, 2019 Fall
LIT 265S Virtual Realities: Collective Dreams from Plato to Cyberspace Guest Lecturer + Experience VR Fieldtrip (MPS Studio Visit) 2023 Spring, 2022 Spring, 2021 Spring
Spanish 203 Intermediate Spanish VR Fieldtrip (MPS Visit) 2023 Fall, 2023 Summer, 2023 Spring
VMS 187 Digital Storytelling and Interactive Narrative Experience VR Fieldtrip (MPS Studio Visit) 2023 Fall, 2021 Spring
ARTHIST 89S Why Art: at the Intersection of Human Minds, Cultures and Societies Experience VR Fieldtrip (MPS Studio Visit) 2023 Fall, 2020 Spring
OT-D 601 Occupation and Technology VR Demo Session 2023 Fall, 2022 Fall
CEE 429 Integrated Structural Design Virtual visits of students' buildings 2022 Spring, 2018 Spring, 2017 Spring, 2016 Spring, 2015 Spring
ISS 320, VMS 326, ISS 720, CMAC 720 Introduction to Programming and User Interface Design in Unity3D Instructor 2021 Spring, 2019 Fall, 2018 Fall, 2017 Fall, 2016 Fall (final projects each term)
Dance 308 Performance and Technology Guest Lecturer and student project support (Arduino) 2021 Spring
ISS 268 Media History: Old and New Guest Lecturer (Social VR) 2020 Fall
ME 555.07, previously ME 555.05 Virtual Reality Systems Research and Design Guest lecturer and student project support (Unity) 2019 Spring, 2018 Spring, 2017 Spring
ARTHIST 190S.01 Egyptian Art and Archaeology Guest lecturer and Experience VR Fieldtrip (XR Studio visit) 2019 Spring
ECE 590.21 Human-Computer Interaction Guest lecturer (Musical Interfaces: Past to Present) 2018 Fall
CLST 544 Introduction to Digital Archaeology Student project support (Unity) 2018 Fall


Art & Music
"Engine Of Many Senses" (2022/2013). Exhibition at UCI Beall Center for Art + Technology. Generative collage of images, sounds, and 3D models. Original project by Bill Seaman and Todd Berreth in 2013; fixes and modernization by David Zielinski in 2022. C++, OpenGL, OpenFrameworks, Xcode. Announcement   Videos
"Altspace Music Modules" (2021). David J. Zielinski, Kenneth D. Stewart, Jil Christensen. A series of music modules that can be used in conjunction with one another and repatched live. Current modules include a piano, staff, MIDI player, cube spawner, and more. Our goal is to help musicians, educators, and artists more easily incorporate interactive musical elements into their AltspaceVR worlds. Demo for the SIVE Workshop at IEEE VR 2021. Altspace MRE / TypeScript. March 27th, 2021. Video   GitHub
"YOUNGA After Party with Pitbull" (2020). Embark on a journey around the world in an immersive experience featuring a pioneering mixed reality performance from Grammy award-winning rapper, songwriter, and record producer Pitbull. Developed a command-and-control console allowing the production team to dynamically spawn objects during the performance. Altspace MRE / TypeScript. December 5th, 2020. Event Listing   YouTube   GitHub
"Geo Sound" (2020). Ken Stewart and David Zielinski. An interactive kinetic musical installation for virtual Burning Man (BRCvr) in Altspace. Users can click on a Geo shape to trigger various sound files. Additionally, when activated, the Geo begins traveling to a new location. Altspace MRE / TypeScript.
"Educational Music Lecture Series in Social VR" (2020). Ken Stewart and David Zielinski. An application within the Altspace platform featuring an interactive piano and staff. The piano has pop-up annotations for intervals and note names. Used for twelve 1-hour lectures during the summer of 2020. Altspace MRE / TypeScript.
"Did You Mean?" (2012) by Elizabeth Landesberg, Peter Lisignoli, Laura Dogget. Coding by David Zielinski. Did You Mean explores the implications of using the word "word" to imply a single vessel or container of meaning. It invites DiVE participants to play around inside language itself, moving through the multiplicity of possible meanings in a single word, especially in translation. In Sanskrit, any number of words can be combined using hyphens and still be considered a single modifier; the longest such compound has 54 component words joined by hyphens. Participants can select certain elements of this word with the wand and see and/or hear how they are usually defined on their own. Selecting a part of the word with the wand triggers a sound and/or an image, which build to a cacophony of sometimes meaningful, sometimes jarring collisions. Virtual Reality. Virtools Game Engine. Picture
"BIDDIE'S BIG BENDER" (2012) by Laurenn McCubbin and Marika Borgeson. Coding by David Zielinski. An exploration of the sounds, spaces, and visuals of Las Vegas through the eyes of a senior citizen looking to have a good time in Sin City. Virtual Reality. Virtools Game Engine. Picture
"Recollection Chorus" by Sarah Goetz (2011). Coding by David Zielinski. Recollection Chorus is a network of floating memories at the Duke immersive Virtual Environment. One walks into a virtual space surrounded by white nodes, which play poetic and didactic media clips on subjects explored in the artist's memory mind map. The media, collected from educational films of the mid 20th century, surround the user, creating a cacophony of commentary on memory. Virtual Reality. Virtools Game Engine. Video   Picture

"The Memory Tower" by Tim Senior (2011). Coding by David Zielinski. The Memory Tower is a full-immersion virtual reality environment exploring the neural mechanisms behind the consolidation and reorganization of memory traces during sleep within the brain. The nature of memory formation is reflected through the use of historical, architectural motifs, serving as visual metaphors for both the content and structure of memory traces. By creating a rich cityscape that encompasses more familiar notions of memory from our everyday encounters with the world around us, the work aims to extend beyond a reductionist, scientific description of memory content and to acknowledge notions of collective human experience and history. Virtual Reality. Virtools Game Engine. Picture
"Desired Dwellings" by Fatimah Tuggar (2010). Coding by David Zielinski. Desired Dwellings is an interactive virtual artwork that allows the viewer/participant to explore a variety of living experiences. Floating animated elements with audio hover above the heads of viewers; these can be grabbed and placed in any of the living situations. It is also possible to scale, rotate, and place multiples of any of the elements. Virtual Reality. Virtools Game Engine. Picture
"Link Media Wall" (2009) is a 4x12 high-resolution tiled display wall installed in the Link, on the lower level of the Duke Perkins Library. The initial exhibit consisted of photos from the Duke Sidney D. Gamble collection. Coding by David Zielinski. Later development of the Link Media Wall was done by Todd Berreth and Matthew Kenney.   Website   Picture
"soundSpace" (2008-present) is a 30 x 30 foot sound space where free movement creates sounds based on images captured by a series of web cameras. Each camera sends images to a cluster of computers that determine how much motion is in the space. The specific location of motion determines which sounds are generated. Some movement initiates familiar outdoor noises such as wind chimes, rain, and lightning. Other sounds include musical instruments such as the marimba, gongs, voices, synthesizers, and a variety of percussion instruments. NC Museum of Life and Science. Interactive Installation.   Video   Museum Site
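The camera-motion-to-sound mapping described above can be sketched roughly as follows. This is an illustrative sketch only, not the installation's actual code; the function names, flattened-frame representation, and threshold are hypothetical:

```typescript
// Illustrative sketch of soundSpace's idea: webcam frames are compared,
// and the floor cell showing the most motion selects which sound plays.
// Frames are modeled as flattened grayscale pixel arrays.

// Total motion between two frames: sum of absolute pixel differences.
function motionAmount(prev: number[], curr: number[]): number {
  let sum = 0;
  for (let i = 0; i < curr.length; i++) {
    sum += Math.abs(curr[i] - prev[i]);
  }
  return sum;
}

// Pick the sound for the cell with the most motion, or null if no
// cell's motion score reaches the trigger threshold.
function triggeredSound(
  cellMotion: number[],   // one motion score per floor cell
  cellSounds: string[],   // one sound name per floor cell
  threshold: number
): string | null {
  let bestIdx = 0;
  for (let i = 1; i < cellMotion.length; i++) {
    if (cellMotion[i] > cellMotion[bestIdx]) bestIdx = i;
  }
  return cellMotion[bestIdx] >= threshold ? cellSounds[bestIdx] : null;
}
```

In the real installation the per-cell motion scores would come from a camera cluster; here they are plain numbers so the mapping logic is easy to see.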
"Future and History of the Interface" (2007) was an interactive exhibit consisting of interview videos of technology pioneers and archival footage of early technology demos. The display of the videos and information was triggered by users' activity levels in the CIEMAS interactive studio. Presented during the HASTAC 2007 conference. Interactive Installation.
"MIX TAPEStry" (2006) is an interactive, networked performance of music and graphics triggered by people moving under web cameras in the CIEMAS Studio and in the Krannert Art Museum at UIUC. The music is an adaptation of a hip hop track called "Lemonade" by Duke adjunct faculty member Robi Roberts (a.k.a. J. Bully). J. Bully performed his song while music and graphics by UIUC faculty member John Jennings were generated simultaneously at both locations. Interactive Installation.   Video   Article
"soundSense" (2004) explores the intersection of reality as it is perceived and communicated by humans and computers. "Reality" in this case means the positions and motions of people in the studio. Although computer systems can detect activity in the room, they are unable to convey it in a manner as rich in nuance as the actual event. By representing electronically detected data as sound, we explore people's ability to discover and articulate patterns of sound and then associate them with a vision of reality. Through soundSense, researchers hope to autonomously transform the "sentience" of the room into a sound communication understandable to people. Interactive Installation.   Article   Picture

Digital Humanities
"150 Faces of Durham" - an AR exhibit anchored to the bull statue in downtown Durham. 150 images of famous Durhamites float around the user while the bull breathes out smoke. Development: Eleanor Taylor. Mentorship: David Zielinski, David Stein. Based on the National Covid Memorial. Tools: Snapchat / Lens Studio (JavaScript). Hardware: phone/tablet. 2023-2024. image  video
"Missing Statue" - AR experience where the user is shown several possible statues for the Duke Chapel's missing statue and asked to choose which is correct. Development: Alyssa Ho. Tweaks: Eleanor Taylor. Mentorship: David Zielinski, David Stein. Tools: Snapchat / Lens Studio (JavaScript). Hardware: phone/tablet. 2023-2024. image  video
"City Viewer" for the Virtual Black Charlotte Project. Large project with a number of GIS, 3D modeling, and other students. Work by David Zielinski centered around using the babylonjs-mapping module to create an in-browser experience. Tools: TypeScript. Hardware: Desktop. 2024.
"babylonjs-mapping". Open-source NPM module that facilitates using OpenStreetMap and Mapbox content natively inside BabylonJS. Tools: TypeScript. 2022. NPM  GitHub
"Duke Library AR". AR experience where users can scan the covers of various books in the Duke library and see additional content (movies, 3D models, etc.). David Stein, Dmitry Kuznetsov. Additional coding by David Zielinski. Additional content by MPS student staffers. Tools: Unity Game Engine w/ AR Foundation. Hardware: phone/tablet. 2022.
"Ughi Augmented". AR experience, part of the "Senses of Venice" exhibition in the Perkins Library, Duke University. Kristin Huffman, Andrea Giordano. Programming: David J. Zielinski. Modeling: Martino Dalla Valle, Cristina Zago, Nevio Danelon. 360 Photography: Noah Michaud, Daphne Turan and Angela Tawfik. Tools: Unity Game Engine w/ AR Foundation. Hardware: Tablet. 2019. News Article  Pictures: 1 2 3 4 5

"Representing Early Modern Venice: Augmented Reality Experiences In Exhibitions". Kristin Love Huffman, Hannah L. Jacobs, David J. Zielinski. International Journal for Digital Art History, no. 6 (October):3.48-3.69. 2023. Link PDF
"Mythological Landscapes and Real Places: Using Virtual Reality to Investigate the Perception of Sacred Space in the Ancient City of Memphis". Nevio Danelon and David J. Zielinski. Conference: Ancient Egypt and New Technology 2019. Indiana University, Bloomington Indiana. Tools: Unity Game Engine. Hardware: Oculus Go. 2019. Abstract   Video   Git Repository   WebGL Version   Build (via SideQuest)   Pictures: 1 2

Published as: "Mythological Landscapes and Real Places: Using Virtual Reality to Investigate the Perception of Sacred Space in the Ancient City of Memphis". N Danelon, DJ Zielinski Ancient Egypt, New Technology, 85-117. 2023. Paper
"Virtual and Augmented Reality Digital Humanities Institute (V/AR-DHI)". Gave demos and talks at the National Endowment for the Humanities funded Summer 2018 Institute for Advanced Topics in Digital Humanities, HT-256969. Additional 2020 Symposium. PI: Victoria Szabo, Art, Art History & Visual Studies and Information Science + Studies. Co-PI: Philip Stern, History. 2018 and 2020.
"Depth Soundings for Orixa: Interactive Diachronistic Visualizations of Portolan Charts & Camel Leopards". Jessica Almy Pagan and David J. Zielinski. CLST 544 Class Final Project. Tools: Unity Game Engine. Hardware: Vive Pro. 2018. Pictures: 1 2 3 4 5 6 7.
"Regium Lepidi". Maurizio Forte. Nevio Danelon. David J. Zielinski (conversion to HTC Vive Pro). The aim of this project is to enhance the cultural heritage of the city of Reggio Emilia, Italy through the promotion and recognition of its historical significance. Collaboration between Duke University and Lions Club Reggio Emilia Host Città del Tricolore. Tools: Unity Game Engine. Hardware: Vive Pro. 2018. Picture
"Basilica Ulpia". Maurizio Forte. Nevio Danelon. David J. Zielinski (conversion to Oculus Go). The reign of Emperor Trajan (98 AD – 117 AD) marked the peak of the Roman Empire's territorial expansion. The project focuses on the Trajan Forum and, in particular, the Basilica Ulpia, the largest basilica of the Roman world. Tools: Unity Game Engine. Hardware: Vive Pro. 2018. Picture
"Carmine VR". Kristin L. Huffman. David J. Zielinski. Postdocs: Mirka Dalla Longa, Emanuela Faresin, Giulia Piccinin. The Confraternity House of the Carmelites, the Scoletta (Padua, Italy), and its frescos. Tools: Unity Game Engine. Hardware: CAVE. Spring 2018. Git Repository
"Dig@IT: Virtual Reality in Archaeology". Emmanuel Shiferaw, Cheng Ma, Regis Kopper, Maurizio Forte, Nicola Lercari, David J. Zielinski (completed DiVE conversion). Virtual visit to the dig site in Catalhoyuk, Turkey. Tools: Unity Game Engine. Hardware: CAVE. Spring 2017. Project Site Git Repository
"Reimagining a Medieval Choir Screen". Andrea Basso, Elisa Castagna, Lucas Giles, David J. Zielinski, Caroline Bruzelius. Virtual visit to Santa Chiara Church in Naples. Tools: Unity Game Engine. Hardware: CAVE. Fall 2016. Article   Git Repository   Pictures: 1 2 3 4 5 6
"Comparison of Interactive Environments for the Archaeological Exploration of 3D Landscape Data". Rebecca Bennett, David J. Zielinski and Regis Kopper. Presented at the IEEE VIS Workshop on 3DVis. Paris, France. Tools: Virtools Game Engine. Hardware: CAVE. 2014. PDF
"La Villa di Livia". Maurizio Forte, Ph.D. Nicola Lercari, Holton Thompson, David J. Zielinski (conversion to Duke DiVE). La Villa di Livia was created to let users explore a Roman villa, both in its current state of ruins and in a hypothetical reconstruction of the space. The ruins are recreated from field research where Professor Forte collects data from archaeological and ancient landscapes using laser scanning, photomodelling, photogrammetry, differential global positioning systems, spatial technologies, movies, and traditional archaeological documentation. Tools: Virtools Game Engine. Hardware: CAVE. 2011. Picture
"Solomon's Temple". Holton Thompson, David J. Zielinski, Anathea Portier-Young, Sean Burrus. For a course in the Divinity School, Professor Anathea Portier-Young brought students into the DiVE to discuss a virtual reconstruction of Solomon's Temple. She paired the immersive experience with her students' reading of the books of Kings and Chronicles, and timed it to coincide with a lecture on Chronicles that focused on worship. The project allowed students to physically and visually experience the religious environment of an otherwise textually encountered site. Port into the DiVE of a model from the 3D Bible Project. Tools: Virtools Game Engine. Hardware: CAVE. 2011. Picture

Virtual Reality Research
"Specimen box: A tangible interaction technique for world-fixed virtual reality displays". David J. Zielinski, Derek Nankivil, Regis Kopper. In 2017 IEEE Symposium on 3D User Interfaces (3DUI). IEEE. Tools: Unity Game Engine. Hardware: CAVE. 2017. Paper Video

"6 degrees-of-freedom manipulation with a transparent, tangible object in world-fixed virtual reality displays". David J. Zielinski, Derek Nankivil, Regis Kopper. Presented as a poster at 2017 IEEE Virtual Reality (VR). IEEE, 2017. PDF Poster

Patented as: "Systems and Methods for Using Sensing of Real Object Position, Trajectory, or Attitude to Enable User Interaction with a Virtual Object." PCT/US2017/053803.
"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments". David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Marc A. Sommer, Lawrence G. Appelbaum, Regis Kopper. IEEE Symposium on 3D User Interfaces (3DUI) 2016. Paper Video

"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments". David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Lawrence G. Appelbaum, Regis Kopper. Presented as a poster at IEEE VR (2016). Honorable Mention for Best Poster Award. PDF Poster
"Exploring the Effects of Image Persistence in Low Frame Rate Virtual Environments." David J. Zielinski, Hrishikesh M. Rao, Marc A. Sommer, Regis Kopper. IEEE VR (2015). Paper
"Enabling Closed-Source Applications for Virtual Reality via OpenGL Intercept Techniques". David J. Zielinski, Ryan McMahan, Solaiman Shokur, Edgard Morya, Regis Kopper. Presented at the SEARIS 2014 workshop. Co-located with the IEEE VR 2014 conference. Paper
"Comparative Study of Input Devices for a VR Mine Simulation". David J. Zielinski, Brendan Macdonald, Regis Kopper. Presented as a poster at IEEE VR 2014. PDF
"Intercept Tags: Enhancing Intercept-based Systems". David J. Zielinski, Regis Kopper, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. VRST 2013. Nanyang Technological University, Singapore. October 6th-8th. Published in the conference proceedings. 2013.   paper pdf
"ML2VR: Providing MATLAB Users an Easy Transition to Virtual Reality and Immersive Interactivity". David J. Zielinski, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. Presented as a poster at IEEE VR 2013. Pages 83-84. Published in the conference proceedings. 2013.   paper pdf   demo video
"Evaluating Display Fidelity and Interaction Fidelity in a Virtual Reality Game". Ryan P. McMahan, Doug A. Bowman, David J. Zielinski, Rachael B. Brady. Presented as a full paper at IEEE VR 2012 conference. March 4th-8th, Orange County, CA. Published in the journal IEEE Transactions on Visualization and Computer Graphics (TVCG). p626-633. 2012.   link   pdf
"Shadow Walking: an Unencumbered Locomotion Technique for Systems with Under-Floor Projection". David J. Zielinski, Ryan P. McMahan, Rachael B. Brady. IEEE Virtual Reality Conference. p. 167-170, March 19-23, 2011.   paper

Psychology
"Psychophysiological Markers of Performance and Learning during Simulated Marksmanship in Immersive Virtual Reality". Sicong Liu, Jillian M. Clements, Elayna P. Kirsch, Hrishikesh M. Rao, David J. Zielinski, Yvonne Lu, Boyla O. Mainsah, Nicholas D. Potter, Marc A. Sommer, Regis Kopper, Lawrence G. Appelbaum. J Cogn Neurosci 2021; 33 (7): 1253–1270. Link
"Neurophysiology of visual-motor learning during a simulated marksmanship task in immersive virtual reality." Jillian M. Clements, Regis Kopper, David J. Zielinski, Hrishikesh Rao, Marc A. Sommer, Elayna Kirsch, Boyla O. Mainsah, Leslie M. Collins, and Lawrence G. Appelbaum. In 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 451-458. IEEE, 2018. PDF
"Measuring Visual Search and Distraction in Immersive Virtual Reality". Bettina Olk, Alina Dinu, David Zielinski, Regis Kopper. Royal Society Open Science. 2018. Link
"Sensorimotor learning during a marksmanship task in immersive virtual reality". Hrishikesh M. Rao, Rajan Khanna, David J. Zielinski, Yvonne Lu, Jillian M. Clements, Nicholas D. Potter, Marc A. Sommer, Regis Kopper, and Lawrence Gregory Appelbaum. Front. Psychol. | doi: 10.3389/fpsyg.2018.00058. January 2018. link
"Medial prefrontal pathways for the contextual regulation of extinguished fear in humans". Fredrik Ahs, Philip A. Kragel, David J. Zielinski, Rachael Brady, Kevin S. LaBar. NeuroImage 2015. Link
"Spatial proximity amplifies valence in emotional memory and defensive approach-avoidance." Fredrik Ahs, Joseph E. Dunsmoor, David J. Zielinski, and Kevin S. LaBar. Neuropsychologia (2014). link
"Extinction in multiple virtual reality contexts diminishes fear reinstatement in humans." Joseph E. Dunsmoor, Fredrik Ahs, David J. Zielinski, Kevin S. LaBar. Neurobiology of Learning and Memory (2014). link
"Effects of virtual environment platforms on emotional responses". Kwanguk Kim, Zachary M Rosenthal, David J Zielinski, Rachael Brady. Computer Methods and Programs in Biomedicine. 2014 Jan 7. link

"Comparison of Desktop, Head Mounted Display, and Six Wall Fully Immersive Systems using a Stressful Task". Kwanguk Kim, M. Zachary Rosenthal, David J. Zielinski and Rachael Brady. Presented as a Poster at the IEEE VR 2012 Conference, March 4th-8th. Orange County, CA. Published in the conference proceedings: p143-144. 2012.   pdf
"Revealing context-specific conditioned fear memories with full immersion virtual reality". Nicole Huff, Jose Alba Hernandez, Matthew Fecteau, David J. Zielinski, Rachael Brady, Kevin S LaBar. Frontiers in Behavioral Neuroscience. 2011. link

"Human Fear Conditioning Conducted in Full Immersion 3-Dimensional Virtual Reality". Nicole C. Huff, David J. Zielinski, Matthew E. Fecteau, Rachael Brady, Kevin S. LaBar. JoVE - Journal of Visualized Experiments. 2010.   link
"Virtual Kitchen". Zachary Rosenthal, David J. Zielinski, Rachael Brady, Holton Thompson. This interactive 3D kitchen scene aims to understand how test subjects react and behave in stressful immersive environments. Test subjects are asked to perform two tasks during their nine minutes in the environment, one of which is impossible to complete. The environment becomes increasingly stressful through irritating audio cues and the frustration of being unable to complete one of the tasks. 2007. Picture

Domain Specific Applications of Virtual Reality / Miscellaneous
Blood Flow Animation - work to take a Randles Lab directory of colored PLYs and automatically turn them into a single UV-mapped model, with each PLY becoming a texture. Flipping through the textures allows the blood flow to be animated easily. David Zielinski (Co-Lab/MPS FTE), Cyrus Tanade, August Wendell, Amanda Randles. 2023-2024. Houdini. image  video
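The texture flip-book scheme amounts to selecting a texture index from the playback clock. A minimal sketch of that idea (hypothetical names; not the actual Houdini setup):

```typescript
// Minimal sketch of texture flip-book playback: each simulation time
// step is baked out as one texture, so animating is just choosing
// which texture index to display for the current playback time.
function frameIndex(timeSec: number, fps: number, frameCount: number): number {
  // Wrap around so the animation loops continuously.
  return Math.floor(timeSec * fps) % frameCount;
}
```

A renderer would call this each frame and bind the texture at the returned index to the shared UV-mapped vessel model.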
VR Nursing Training. Exploratory project to utilize ChatGPT to aid in nursing training. Weijun Li (MPS Dev Student), David Zielinski (Co-Lab/MPS FTE), Kimberly Coston (Duke RN). Unity w/ XR Interaction Toolkit. 2023-2024. hi res image
Neurology Surgery Simulator (EVD Placement). Utilizing Microsoft HoloLens. Summer 2017. Patented as: "Augmented Reality-Based Navigation for Use in Surgical and Non-Surgical Procedures". PCT/US18/17381
"Exploring Proteins". NEUROBIO 719 - Concepts in Neuroscience I. Professor Jorg Grandl. Spring 2017. Picture
"Exploring Building Designs". CEE 429 - Integrated Structural Design. Professor Joe Nadeau. Student final projects. Spring 2017. Picture
"Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices." Mehta, Rajvi, Derek Nankivil, David J. Zielinski, Gar Waterman, Brenton Keller, Alexander T. Limkakeng, Regis Kopper, Joseph A. Izatt, and Anthony N. Kuo. Translational vision science & technology 6, no. 1 (2017): 5-5. Link   Picture
"Visualizing Blood Flow Through Arteries". BME 307: Transport Phenomena in Biological Systems. Professor Amanda Randles. Student final projects. Spring 2017. Picture
"Klampt Web Interface". Web front end for the Klampt robot planning library. David J. Zielinski and Kris Hauser. Utilizing: JavaScript, jQuery, three.js, WebSockets, C++ backend (w/ embedded Python interpreter). Summer 2016.
"The Walk Again Project: An EEG/EMG training paradigm to control locomotion". Fabricio L Brasil, Renan C Moioli, Kyle Fast, Anthony L Lin, Nicole A Peretti, Angelo Takigami, Kenneth R Lyons, David J Zielinski, Lumy Sawaki, Sanjay S Joshi, Edgard Morya, Miguel AL Nicolelis. Poster at Society for Neuroscience (SFN) Conference. 2014.
"Three Dimensional Reconstruction of Premature Infant Retinal Vessels". Ramiro S. Maldonado, MD, Cynthia A. Toth, MD, David J. Zielinski. Using volume-rendering software (Avizo), we produced three-dimensional reconstructions of normal and abnormal vessels in these infants. 2012. Picture
"Smoking and How it Changes the Brain - An Adventure in the Duke Immersive Virtual Environment". Rochelle D. Schwartz-Bloom, Rachael Brady. David J. Zielinski (programming). Holton Thompson (modeling). Jonathan Bloom (narration). Travel into the avatar's brain to the "reward pathway". There, you will interact with nicotine molecules to learn how smoking changes receptors for nicotine on the neurons that provide pleasurable feelings. You'll take a ride along the reward pathway. 2012. Video   Picture
"Duke Food". Emily Mass, Rachael Brady, Holton Thompson, David J. Zielinski, Paul Gratham, Sarah McGowan, Charlotte Clark, Melanie Green, Jocelyn Wells, Brandon Pierce. Duke Food is meant to test sustainable eating habits. The environment is modeled after the Great Hall on Duke's West Campus. The virtual eatery allows the user to browse through the dining hall stations, putting food on and off their tray as they interact with the environment. Every food item has a visual and audio stimulus paired with it and can be selected and "de-selected" once the stimulus occurs. The stimuli are separated into two categories: good and bad. The stimuli are chosen for each food based on the number of carbon dioxide equivalent (CO2e) points the food item was awarded by the Bon Appetit Carbon Calculator. 2011.
"KinImmerse: Macromolecular VR for NMR ensembles". Jeremy N Block, David J Zielinski, Vincent B Chen, Ian W Davis, Claire E Vinson, Rachael Brady, Jane S Richardson, and David C Richardson. Source Code for Biology and Medicine 2009.   link   paper   code   Picture
"DiVE into Alcohol". Rochelle D Schwartz-Bloom, Rachael Brady, Marcel Yang, David P. McMullen. David J. Zielinski (additional fixes). DiVE into Alcohol is a virtual reality (VR) program that can be used in chemistry education at the high school and college level, both as an immersive experience, or as a web-based program. The program is presented in the context of an engaging topic--the oxidation of alcohol based on genetic differences in metabolism within the liver cell. The user follows alcohol molecules through the body to the liver and into the enzyme active site where the alcohol is oxidized. Video. 2009.
"VR Visualization as an Interdisciplinary Collaborative Data Exploration Tool for Large Eddy Simulations of Biosphere-Atmosphere Interactions". Gil Bohrer, Marcos Longo, David J. Zielinski, Rachael Brady. Proceedings of the International Symposium on Advances in Visual Computing (ISVC). Las Vegas, Nevada, December 1-3, 2008.   link   paper  video
"Exploring Semantic Social Networks using Virtual Reality". Harry Halpin, David J Zielinski, Rachael Brady, Glenda Kelly. Proceedings of the International Semantic Web Conference (ISWC). Karlsruhe, Germany, October 26-30, 2008.   link   paper

"Redgraph: Navigating Semantic Web Networks using Virtual Reality", Harry Halpin, David J. Zielinski, Rachael Brady, Glenda Kelly. Presented as a poster at the IEEE Virtual Reality 2008 Conference, Reno NV. March 8-12. Published in the conference proceedings p 257-258. 2008. link   PDF   Picture