David J. Zielinski

Smith Media Labs Technology Specialist (Analyst, IT)
Department of Art, Art History & Visual Studies
Duke University

Email: djzielin AT duke.edu
Office: Smith Warehouse, Bay 11, A231
Phone: 919-684-2208
Google Scholar page: here

David J. Zielinski is currently a technology specialist for the Department of Art, Art History & Visual Studies (2018-present). He was previously a member of the DiVE Virtual Reality Lab (2004-2018), under the direction of Regis Kopper (2013-2018), Ryan P. McMahan (2012), and Rachael Brady (2004-2012). He received his bachelor's (2002) and master's (2004) degrees in Computer Science from the University of Illinois at Urbana-Champaign, where he worked on a suite of virtual reality musical instruments (video) under the guidance of William R. Sherman. He is an experienced VR/AR software developer and is also experienced with the hardware configuration and systems integration involved in setting up and maintaining virtual reality systems. His current research interests include domain-specific applications of VR, social VR, and VR/AR musical instruments. He has additional interests in real-time audio synthesis and audio signal processing.

Teaching

ISS 320, VMS 326 Introduction to Programming and User Interface Design in Unity3D. Instructor. 2019 Fall (final projects), 2018 Fall (final projects), 2017 Fall (final projects), 2016 Fall (final projects).
ISS 268 Media History: Old and New. Guest lecturer. 2020 Fall.
ISS 305 Virtual Museums. Guest lecturer and implementation support for students. 2020 Fall, 2019 Fall.
ARTHIST 89S Why Art: at the Intersection of Human Minds, Cultures and Societies. Developed a virtual gallery for a group field trip for the "Sarcophagus of the Spouses". 2020 Spring.
ARTHIST 190S.01 Egyptian Art and Archaeology. Guest lecturer: Introduction to VR lecture and XR Studio visit. 2019 Spring.
ECE 590.21 Human-Computer Interaction. Guest lecturer: "Musical Interfaces: Past to Present". 2018 Fall.
CLST 544 Introduction to Digital Archaeology. Implementation support for students: Oculus Go, HTC Vive Pro. 2018 Fall.
ME 555.07 (previously ME 555.05) Virtual Reality Systems Research and Design. Guest lecturer and implementation support for students. 2019 Spring, 2018 Spring, 2017 Spring.

Talks and Outreach

2/17/2020 VARDHI XR "Tech Retreat". "Developing Mobile AR with Unity" and "Augmenting the Lodovico Ughi Venice Map".
3/26/2019 Franklin Humanities Institute Digital Humanities Week, "Futuristic Publishing Forms for Digital and Hybrid Scholarship". "Challenges of VR Application Distribution". Slides  Video
10/6/2018 - 10/7/2018 Duke TIP Scholar Weekend. Outreach to academically talented middle school students. VR demos and an introduction to Unity.
7/23/2018 - 8/3/2018 V/AR-DHI (VR/AR Institute). Technical assistance for presenter VR demos. Demo sessions of the Duke DiVE, Microsoft HoloLens, and Oculus Go. (link)
12/3/2012 Smith Warehouse Rendezvous. "1-Bit Audio and the Arduino" (slides)

Digital Humanities Projects
"Ughi Augmented". AR experience part of the "Senses of Venice" exhibition in the Perkins Library. Duke University. Kristin Huffman, Andrea Giordano. Programming: David J. Zielinski. Modeling: Martino Dalla Valle, Cristina Zago, Nevio Danelon. 360 Photography: Noah Michaud, Daphne Turan and Angela Tawfik. Unity Game Engine. 2019. News Article  Git Repository  Pictures: 1 2 3 4 5
"Mythological Landscapes and Real Places: Using Virtual Reality to Investigate the Perception of Sacred Space in the Ancient City of Memphis". Nevio Danelon and David J. Zielinski. Conference: Ancient Egypt and New Technology 2019. Indiana University, Bloomington Indiana. Unity Game Engine. 2019. Abstract   Video   Pictures: 1 2
"Depth Soundings for Orixa: Interactive Diachronistic Visualizations of Portolan Charts & Camel Leopards". Jessica Almy Pagan and David J. Zielinski. CLST 544 Class Final Project. Unity Game Engine. 2018. Screen Shots: 1 2 3 4 5 6 7.
"Regium Lepidi". Maurizio Forte. Nevio Danelon. David J. Zielinski (conversion to HTC Vive Pro). The aim of this project is to enhance the cultural heritage of the city of Reggio Emilia, Italy through the promotion and recognition of its historical significance. Collaboration between Duke University and Lions Club Reggio Emilia Host Città del Tricolore. Unity Game Engine. 2018.
"Basilica Ulpia". Maurizio Forte. Nevio Danelon. David J. Zielinski (conversion to Oculus Go). The reign of Emperor Trajan (98 AD – 117 AD) marked the peak of the Roman Empire's territorial expansion. The project focuses on the Trajan Forum and, in particular, the Basilica Ulpia, the largest basilica of the Roman world. Unity Game Engine. 2018.
"Carmine VR". Kristin L. Huffman. David J. Zielinski. Postdocs: Mirka Dalla Longa, Emanuela Faresin, Giulia Piccinin. A virtual visit to the Confraternity House of the Carmelites, the Scoletta (Padua, Italy), and its frescoes. Unity Game Engine. Spring 2018. Git Repository
"Dig@IT: Virtual Reality in Archaeology". Emmanuel Shiferaw, Cheng Ma, Regis Kopper, Maurizio Forte, Nicola Lercari, David J. Zielinski (completed DiVE conversion). Virtual visit to a dig site in Catalhoyuk, Turkey. Unity Game Engine. Spring 2017. Project Site Git Repository
"Reimagining a Medieval Choir Screen". Andrea Basso, Elisa Castagna, Lucas Giles, David J. Zielinski, Caroline Bruzelius. Virtual visit to Santa Chiara Church in Naples. Unity Game Engine. Fall 2016. Article   Git Repository
"Comparison of Interactive Environments for the Archaeological Exploration of 3D Landscape Data". Rebecca Bennett, David J. Zielinski and Regis Kopper. Presented at the IEEE VIS Workshop on 3DVis. Paris, France. Virtools Game Engine. 2014. PDF
"La Villa di Livia". Maurizio Forte, Nicola Lercari, Holton Thompson, David J. Zielinski (conversion to Duke DiVE). La Villa Di Livia was created to let users explore a Roman village, both in its current state of ruins and in a hypothetical reconstruction of the space. The village ruins are recreated from field research in which Professor Forte collects data from archaeological and ancient landscapes using laser scanning, photomodelling, photogrammetry, differential global positioning systems, spatial technologies, movies, and traditional archaeological documentation. Virtools Game Engine. 2011.
"Solomon’s Temple". Holton Thompson, David J. Zielinski, Anathea Portier-Young, Sean Burrus. For a course in the Divinity School, Professor Anathea Portier-Young brought students into the DiVE to discuss a virtual reconstruction of Solomon’s Temple. She paired the immersive experience with her students’ reading of the books of Kings and Chronicles, and timed it to coincide with a lecture on Chronicles that focused on worship. The project allowed students to physically and visually experience the religious environment of an otherwise textually encountered site. Port into the DiVE of a model from the 3D Bible Project. Virtools Game Engine. 2011.

Virtual Reality Art Collaborations
"Did You Mean?" (2012) by Elizabeth Landesberg, Peter Lisignoli, Laura Dogget. Did You Mean explores the implications of using the word "word" to imply a single vessel or container of meaning. It invites DiVE participants to play around inside language itself, moving through the multiplicities of possibilities for meaning in a single word, and especially in translation. In Sanskrit, any number of words can be combined using hyphens and still be considered a single modifier, the longest of which has 54 component words joined by hyphens. Participants can select certain elements of this word with the wand and see and/or hear how they are usually defined on their own. Selecting a part of the word with the wand triggers a sound and/or an image, which build to a cacophony of sometimes meaningful, sometimes jarring collisions.
"BIDDIE'S BIG BENDER" (2012) by Laurenn McCubbin and Marika Borgeson. This project is an exploration of the sounds, spaces, and visuals of Las Vegas, through the eyes of a senior citizen looking to have a good time in Sin City.
"Recollection Chorus" by Sarah Goetz (2011). Recollection Chorus is a network of floating memories at the Duke immersive Virtual Environment. One walks into a virtual space surrounded by white nodes, which play poetic and didactic media clips on subjects explored in the artist's memory mind map. The media, collected from educational films of the mid 20th century, surround the user, creating a cacophony of commentary on memory. Video

"The Memory Tower" by Tim Senior (2011). The ‘Memory Tower’ is a full-immersion virtual reality environment exploring the neural mechanisms behind the consolidation and reorganization of memory traces in the brain during sleep. The nature of memory formation is reflected through historical, architectural motifs, which serve as visual metaphors for both the content and structure of memory traces. By creating a rich cityscape that encompasses more familiar notions of memory from our everyday encounters with the world around us, the work aims to extend beyond a reductionist, scientific description of memory content and to acknowledge notions of collective human experience and history.
"Desired Dwellings" by Fatimah Tuggar (2010). Desired Dwellings is an interactive virtual artwork that allows the viewer/participant to explore a variety of living experiences. Floating animated elements with audio hover above the heads of viewers: these can be grabbed and placed in any of the living situations. It is also possible to scale, rotate, and place multiples of any of the elements.

Virtual Reality Research
"Specimen box: A tangible interaction technique for world-fixed virtual reality displays". David J. Zielinski, Derek Nankivil, Regis Kopper. In 2017 IEEE Symposium on 3D User Interfaces (3DUI). IEEE, 2017. Paper Video

"6 degrees-of-freedom manipulation with a transparent, tangible object in world-fixed virtual reality displays". David J. Zielinski, Derek Nankivil, Regis Kopper. Presented as a poster at 2017 IEEE Virtual Reality (VR). IEEE, 2017. PDF Poster

Patented as: "Systems and Methods for Using Sensing of Real Object Position, Trajectory, or Attitude to Enable User Interaction with a Virtual Object." PCT/US2017/053803.
"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments". David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Marc A. Sommer, Lawrence G. Appelbaum, Regis Kopper. IEEE Symposium on 3D User Interfaces (3DUI) 2016. Paper Video

"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments". David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Lawrence G. Appelbaum, Regis Kopper. Presented as a poster at IEEE VR (2016). Honorable Mention for Best Poster Award. PDF Poster
"Exploring the Effects of Image Persistence in Low Frame Rate Virtual Environments." David J. Zielinski, Hrishikesh M. Rao, Marc A. Sommer, Regis Kopper. IEEE VR (2015). Paper
"Enabling Closed-Source Applications for Virtual Reality via OpenGL Intercept Techniques". David J. Zielinski, Ryan McMahan, Solaiman Shokur, Edgard Morya, Regis Kopper. Presented at the SEARIS 2014 workshop. Co-located with the IEEE VR 2014 conference. Paper
"Comparative Study of Input Devices for a VR Mine Simulation". David J. Zielinski, Brendan Macdonald, Regis Kopper. Presented as a poster at IEEE VR 2014. PDF
"Intercept Tags: Enhancing Intercept-based Systems". David J. Zielinski, Regis Kopper, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. VRST 2013. Nanyang Technological University, Singapore. October 6th-8th. Published in the conference proceedings. 2013.   paper pdf
"ML2VR: Providing MATLAB Users an Easy Transition to Virtual Reality and Immersive Interactivity". David J. Zielinski, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. Presented as a poster at IEEE VR 2013. Pages 83-84. Published in the conference proceedings. 2013.   paper pdf   demo video  
"Evaluating Display Fidelity and Interaction Fidelity in a Virtual Reality Game". Ryan P. McMahan, Doug A. Bowman, David J. Zielinski, Rachael B. Brady. Presented as a full paper at IEEE VR 2012 conference. March 4th-8th, Orange County, CA. Published in the journal IEEE Transactions on Visualization and Computer Graphics (TVCG). p626-633. 2012.   link   pdf
"Shadow Walking: an Unencumbered Locomotion Technique for Systems with Under-Floor Projection". David J. Zielinski, Ryan P. McMahan, Rachael B. Brady. IEEE Virtual Reality Conference. p. 167-170, March 19-23, 2011.   paper

Domain Specific Applications of Virtual Reality
"Measuring Visual Search and Distraction in Immersive Virtual Reality". Bettina Olk, Alina Dinu, David J. Zielinski, Regis Kopper. Royal Society Open Science. 2018. Link
"Sensorimotor learning during a marksmanship task in immersive virtual reality". Hrishikesh M. Rao, Rajan Khanna, David J. Zielinski, Yvonne Lu, Jillian M. Clements, Nicholas D. Potter, Marc A. Sommer, Regis Kopper, and Lawrence Gregory Appelbaum. Front. Psychol. | doi: 10.3389/fpsyg.2018.00058. January 2018. link
"Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices." Mehta, Rajvi, Derek Nankivil, David J. Zielinski, Gar Waterman, Brenton Keller, Alexander T. Limkakeng, Regis Kopper, Joseph A. Izatt, and Anthony N. Kuo. Translational vision science & technology 6, no. 1 (2017): 5-5. Link
Neurology Surgery Simulator (EVD Placement). Utilizing Microsoft HoloLens. Summer 2017. Patented as: "Augmented Reality-Based Navigation for Use in Surgical and Non-Surgical Procedures". PCT/US18/17381
"Exploring Proteins". NEUROBIO 719 - Concepts in Neuroscience I. Professor Jorg Grandl. Spring 2017.
"Exploring Building Designs". CEE 429 - Integrated Structural Design. Professor Joe Nadeau. Student final projects. Spring 2017.
"Visualizing Blood Flow Through Arteries". BME 307: Transport Phenomena in Biological Systems. Professor Amanda Randles. Student final projects. Spring 2017.
"Klampt Web Interface". Web front end for robot planning library. David J. Zielinski and Kris Hauser. Utilizing: javascript, jquery, three.js, websockets, C++ backend (w/ embedded python interpreter). Summer 2016.
"Medial prefrontal pathways for the contextual regulation of extinguished fear in humans". Fredrik Ahs, Philip A Kragel, David J Zielinski, Rachael Brady, Kevin S LaBar. NeuroImage 2015. Link
"Spatial proximity amplifies valence in emotional memory and defensive approach-avoidance." Fredrik Ahs, Joseph E. Dunsmoor, David J. Zielinski, and Kevin S. LaBar. Neuropsychologia (2014). link
"The Walk Again Project: An EEG/EMG training paradigm to control locomotion". Fabricio L Brasil, Renan C Moioli, Kyle Fast, Anthony L Lin, Nicole A Peretti, Angelo Takigami, Kenneth R Lyons, David J Zielinski, Lumy Sawaki, Sanjay S Joshi, Edgard Morya, Miguel AL Nicolelis. Poster at Society for Neuroscience (SFN) Conference. 2014.
"Extinction in multiple virtual reality contexts diminishes fear reinstatement in humans." Joseph E. Dunsmoor, Fredrik Ahs, David J. Zielinski, Kevin S. LaBar. Neurobiology of learning and memory (2014). link
"Effects of virtual environment platforms on emotional responses". Kwanguk Kim, Zachary M Rosenthal, David J Zielinski, Rachael Brady. Computer Methods and Programs in Biomedicine. 2014 Jan 7. link

"Comparison of Desktop, Head Mounted Display, and Six Wall Fully Immersive Systems using a Stressful Task". Kwanguk Kim, M. Zachary Rosenthal, David J. Zielinski and Rachael Brady. Presented as a Poster at the IEEE VR 2012 Conference, March 4th-8th. Orange County, CA. Published in the conference proceedings: p143-144. 2012.   pdf
"Three Dimensional Reconstruction of Premature Infant Retinal Vessels". Ramiro S. Maldonado, MD, Cynthia A. Toth, MD, David J. Zielinski. Using volume-rendering software (Avizo), we produced a three dimensional reconstruction of normal and abnormal vessels in these infants. 2012.
"Smoking and How it Changes the Brain - An Adventure in the Duke Immersive Virtual Environment". Rochelle D. Schwartz-Bloom, Rachael Brady. David J. Zielinski (programming). Holton Thompson (modeling). Jonathan Bloom (narration). Travel into the avatar’s brain to the “reward pathway”. There, you will interact with nicotine molecules to learn how smoking changes receptors for nicotine on the neurons that provide pleasurable feelings. You’ll take a ride along the reward pathway. 2012. Video
"Revealing context-specific conditioned fear memories with full immersion virtual reality". Nicole Huff, Jose Alba Hernandez, Matthew Fecteau, David J. Zielinski, Rachael Brady, Kevin S LaBar. Frontiers in Behavioral Neuroscience. 2011. link

"Human Fear Conditioning Conducted in Full Immersion 3-Dimensional Virtual Reality". Nicole C. Huff, David J. Zielinski, Matthew E. Fecteau, Rachael Brady, Kevin S. LaBar. JoVE - Journal of Visualized Experiments. 2010.   link
"Duke Food". Emily Mass, Rachael Brady, Holton Thompson, David J. Zielinski, Paul Gratham, Sarah McGowan, Charlotte Clark, Melanie Green, Jocelyn Wells, Brandon Pierce. Duke Food is meant to test sustainable eating habits. The environment is modeled after the Great Hall on Duke’s West Campus. The virtual eatery allows the user to browse through the dining hall stations, putting food on and off their tray as they interact with the environment. Every food item has a visual and audio stimulus paired with it and can be selected and “de-selected” once the stimulus occurs. The stimuli are separated into two categories: good and bad. The stimuli are chosen for each food based on the number of carbon dioxide equivalent (CO2e) points the food item was awarded by the Bon Appetit Carbon Calculator. 2011.
"KinImmerse: Macromolecular VR for NMR ensembles". Jeremy N Block, David J Zielinski, Vincent B Chen, Ian W Davis, E Claire Vinson, Rachael Brady, Jane S Richardson, and David C Richardson. Source Code for Biology and Medicine 2009.   link   paper   code
"DiVE into Alcohol". Rochelle D Schwartz-Bloom, Rachael Brady, Marcel Yang, David P. McMullen. David J. Zielinski (additional fixes). DiVE into Alcohol is a virtual reality (VR) program that can be used in chemistry education at the high school and college level, both as an immersive experience and as a web-based program. The program is presented in the context of an engaging topic: the oxidation of alcohol, based on genetic differences in metabolism within the liver cell. The user follows alcohol molecules through the body to the liver and into the enzyme active site where the alcohol is oxidized. Video. 2009.
"VR Visualization as an Interdisciplinary Collaborative Data Exploration Tool for Large Eddy Simulations of Biosphere-Atmosphere Interactions". Gil Bohrer, Marcos Longo, David J. Zielinski, Rachael Brady. Proceedings of the International Symposium on Advances in Visual Computing (ISVC). Las Vegas, Nevada, December 1-3, 2008.   link   paper  video
"Exploring Semantic Social Networks using Virtual Reality". Harry Halpin, David J Zielinski, Rachael Brady, Glenda Kelly. Proceedings of the International Semantic Web Conference (ISWC). Karlsruhe, Germany, October 26-30, 2008.   link   paper  

"Redgraph: Navigating Semantic Web Networks using Virtual Reality", Harry Halpin, David J. Zielinski, Rachael Brady, Glenda Kelly. Presented as a poster at the IEEE Virtual Reality 2008 Conference, Reno NV. March 8-12. Published in the conference proceedings p 257-258. 2008. link   text
"Virtual Kitchen". Zachary Rosenthal, David J. Zielinski, Rachael Brady, Holton Thompson. This interactive 3D kitchen scene aims to understand how test subjects react and behave in stressful immersive environments. The test subjects are asked to perform two tasks during their nine minutes in the environment, one of which is impossible to complete. The environment becomes increasingly stressful through irritating audio cues and the frustration of being unable to complete one of the tasks. 2007.

Interactive Installation Collaborations
"Link Media Wall" (2009) is a 4x12 high-resolution display wall installed in the Link, which is found on the lower level of the Duke Perkins Library. The initial exhibit consisted of photos from the Duke Sidney D. Gamble collection. Later development of the Link Media Wall was done by Todd Berreth and Matthew Kenney.   Website
"soundSpace" (2008-present) is a 30 x 30 foot sound space where free movement creates sounds based on images captured by a series of web cameras. Each camera sends images to a cluster of computers that determine how much motion is in the space. The specific location of motion determines which specific sounds are generated. Some movement initiates familiar outdoor sounds such as wind chimes, rain, and lightning. Other sounds include musical instruments such as the marimba, gongs, voices, synthesizers, and a variety of percussion instruments. NC Museum of Life and Science.   Video   Museum Site
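The camera-to-sound pipeline described above can be sketched as per-region frame differencing. The function names, grid size, threshold, and sound mapping below are illustrative assumptions for exposition, not the installation's actual implementation.

```python
import numpy as np

def motion_by_region(prev_frame, curr_frame, grid=(3, 3), threshold=30):
    """Split a grayscale camera frame into a grid of cells and report the
    fraction of pixels in each cell that changed since the previous frame."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    moving = diff > threshold  # pixels with a noticeable brightness change
    h, w = moving.shape
    gh, gw = h // grid[0], w // grid[1]
    levels = {}
    for r in range(grid[0]):
        for c in range(grid[1]):
            cell = moving[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            levels[(r, c)] = cell.mean()  # 0.0 (still) to 1.0 (all moving)
    return levels

def pick_sounds(levels, sound_map, activation=0.2):
    """Map sufficiently active grid cells to sound names (hypothetical map)."""
    return [sound_map[cell] for cell, level in levels.items()
            if level >= activation and cell in sound_map]
```

In this sketch, each region of the floor is bound to one sound, so motion in the top-left of the camera's view might trigger wind chimes while motion in the center triggers a marimba; the real installation distributes this work across a camera cluster.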
"Future and History of the Interface" (2007) was an interactive exhibit consisting of interview videos of technology pioneers and archival footage of early technology demos. The display of the videos and information was triggered by the users' activity levels in the CIEMAS interactive studio. Presented during the HASTAC 2007 conference.  
"MIX TAPEStry" (2006) was an interactive, networked performance of music and graphics triggered by people moving under web cameras in the CIEMAS Studio and in the Krannert Art Museum at UIUC. The music is an adaptation of a hip hop track called "Lemonade" by Duke Adjunct Faculty member, Robi Roberts (a.k.a. J. Bully). J. Bully performed his song while music and graphics by UIUC faculty member John Jennings were generated simultaneously at both locations.   Video   Article
"soundSense" (2004) explores the intersection of reality as it is perceived and communicated by humans and computers. "Reality" in this case means the positions and motions of people in the studio. Although computer systems can detect activity in the room, they are unable to convey it in a manner that is as rich in nuance as the actual event. By representing electronically detected data as sound, we explore people’s ability to discover and articulate patterns of sound and then associate that with a vision of reality. Through soundSense, researchers hope to autonomously transform the "sentience" of the room into a sound communication understandable to people.   Article