David J. Zielinski

Research and Development Engineer
DiVE Virtual Reality Lab
Duke University

email: djzielin AT duke.edu
office: 410 Teer Engineering Building
mailing address: Box 90291, Teer Building Room 305, Durham, NC 27708
phone: 919-660-5032

David J. Zielinski is a research and development engineer at Duke University in the Pratt School of Engineering. He has been a member of the DiVE Virtual Reality Lab since 2004, under the direction of Regis Kopper (2013-present) and previously Rachael Brady (2004-2012). He is an experienced software engineer who received his bachelor's (2002) and master's (2004) degrees in Computer Science from the University of Illinois at Urbana-Champaign. He has experience in the design and implementation of virtual reality experiences, as well as in the hardware configuration and systems integration issues of virtual reality systems. His virtual reality research interests include software systems, interaction techniques, tracking systems, and domain-specific applications of VR. He has additional interests in audio synthesis and physical computing via Arduino.

Google Scholar Page

Courses Taught
ISS320S - Introduction to Programming and User Interface Design in Unity 3D (Fall 2016)

Publications and Posters

"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments" David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Marc A. Sommer, Lawrence G. Appelbaum, Regis Kopper. IEEE Symposium on 3D User Interfaces (3DUI) 2016. PDF Video
"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments" David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Lawrence G. Appelbaum, Regis Kopper. Presented as poster at IEEE VR (2016). Honorable Mention for Best Poster Award. PDF Poster
"Medial prefrontal pathways for the contextual regulation of extinguished fear in humans". Fredrik Ahs, Philip A Kragel, David J Zielinski, Rachael Brady, Kevin S LaBar. NeuroImage 2015. Link
"Exploring the Effects of Image Persistence in Low Frame Rate Virtual Environments." David J. Zielinski, Hrishikesh M. Rao, Marc A. Sommer, Regis Kopper. IEEE VR (2015). PDF Slides Video
"Spatial proximity amplifies valence in emotional memory and defensive approach-avoidance." Fredrik Ahs, Joseph E. Dunsmoor, David Zielinski, and Kevin S. LaBar. Neuropsychologia (2014). link
"The Walk Again Project: An EEG/EMG training paradigm to control locomotion". Fabricio L Brasil, Renan C Moioli, Kyle Fast, Anthony L Lin, Nicole A Peretti, Angelo Takigami, Kenneth R Lyons, David J Zielinski, Lumy Sawaki, Sanjay S Joshi, Edgard Morya, Miguel AL Nicolelis. Poster at Society for Neuroscience (SFN) Conference. 2014.
"Comparison of Interactive Environments for the Archaeological Exploration of 3D Landscape Data". Rebecca Bennett, David J. Zielinski and Regis Kopper. Presented at the IEEE VIS Workshop on 3DVis. Paris, France. November 9th, 2014. PDF
"Extinction in multiple virtual reality contexts diminishes fear reinstatement in humans." Joseph E. Dunsmoor, Fredrik Ahs, David J. Zielinski, Kevin S. LaBar. Neurobiology of learning and memory (2014). link
"Enabling Closed-Source Applications for Virtual Reality via OpenGL Intercept Techniques". David Zielinski, Ryan McMahan, Solaiman Shokur, Edgard Morya, Regis Kopper. Presented at the SEARIS 2014 workshop. Co-located with the IEEE VR 2014 conference. PDF
"Comparative Study of Input Devices for a VR Mine Simulation". David Zielinski, Brendan Macdonald, Regis Kopper. Presented as a poster at IEEE VR 2014. PDF
"Effects of virtual environment platforms on emotional responses". Kwanguk Kim, Zachary M Rosenthal, David J Zielinski, Rachael Brady. Computer Methods and Programs in Biomedicine. 2014 Jan 7. link
"Intercept Tags: Enhancing Intercept-based Systems". David J. Zielinski, Regis Kopper, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. VRST 2013. Nanyang Technological University, Singapore. October 6th-8th. Published in the conference proceedings. 2013.   paper pdf   demo video   talk video (duke)   talk slides
"ML2VR: Providing MATLAB Users an Easy Transition to Virtual Reality and Immersive Interactivity". David J. Zielinski, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. Presented as a poster at IEEE VR 2013. Candidate for best poster award. Pages 83-84. Published in the conference proceedings. 2013.   paper pdf   demo video   talk video (duke)   talk slides   github
"Evaluating Display Fidelity and Interaction Fidelity in a Virtual Reality Game". Ryan P. McMahan, Doug A. Bowman, David J. Zielinski, Rachael B. Brady. Presented as a full paper at IEEE VR 2012 conference. March 4th-8th, Orange County, CA. Published in the journal IEEE Transactions on Visualization and Computer Graphics (TVCG). p626-633. 2012.  link   pdf
"Comparison of Desktop, Head Mounted Display, and Six Wall Fully Immersive Systems using a Stressful Task". Kwanguk Kim, M. Zachary Rosenthal, David Zielinski and Rachael Brady. Presented as a Poster at the IEEE VR 2012 Conference, March 4th-8th. Orange County, CA. Published in the conference proceedings: p143-144. 2012.   link   pdf
"Revealing context-specific conditioned fear memories with full immersion virtual reality". Nicole Huff, Jose Alba Hernandez, Matthew Fecteau, David Zielinski, Rachael Brady, Kevin S LaBar. Frontiers in Behavioral Neuroscience. 2011. link

"Shadow Walking: an Unencumbered Locomotion Technique for Systems with Under-Floor Projection". David J. Zielinski, Ryan P. McMahan, Rachael B. Brady. IEEE Virtual Reality Conference. p. 167-170, March 19-23, 2011.   paper   slides   video   code   talk video (duke)   link
"Human Fear Conditioning Conducted in Full Immersion 3-Dimensional Virtual Reality". Nicole C. Huff, David J. Zielinski, Matthew E. Fecteau, Rachael Brady, Kevin S. LaBar. JoVE - Journal of Visualized Experiments. 2010.  link
"KinImmerse: Macromolecular VR for NMR ensembles". Jeremy N Block, David J Zielinski, Vincent B Chen, Ian W Davis, E Claire Vinson, Rachael Brady, Jane S Richardson, and David C Richardson. Source Code for Biology and Medicine 2009.   link   paper   code
"VR Visualization as an Interdisciplinary Collaborative Data Exploration Tool for Large Eddy Simulations of Biosphere-Atmosphere Interactions". Gil Bohrer, Marcos Longo, David J. Zielinski, Rachael Brady. Proceedings of the International Symposium on Advances in Visual Computing (ISVC). Las Vegas, Nevada, December 1-3, 2008.   link   paper  video
"Exploring Semantic Social Networks using Virtual Reality". Harry Halpin, David J Zielinski, Rachael Brady, Glenda Kelly. Proceedings of the International Semantic Web Conference (ISWC). Karlsruhe, Germany, October 26-30, 2008.   link   paper   code
"Redgraph: Navigating Semantic Web Networks using Virtual Reality", Harry Halpin, David J. Zielinski, Rachael Brady, Glenda Kelly. Presented as a poster at the IEEE Virtual Reality 2008 Conference, Reno NV. March 8-12. Published in the conference proceedings p 257-258. 2008. link   text

Virtual Art Collaborations

"Did You Mean?" (2012) by Elizabeth Landesberg, Peter Lisignoli, Laura Dogget. Did You Mean explores the implications of using the word "word" to imply a single vessel or container of meaning. It invites DiVE participants to play around inside language itself, moving through the multiplicities of possibilities for meaning in a single word, and especially in translation. In Sanskrit, any number of words can be combined using hyphens and still be considered a single modifier, the longest of which has 54 component words joined by hyphens. Participants can select certain elements of this word with the wand and see and/or hear how they are usually defined on their own. Selecting a part of the word with the wand triggers a sound and/or an image, which build to a cacophony of sometimes meaningful, sometimes jarring collisions.
"BIDDIE'S BIG BENDER" (2012) by Laurenn McCubbin and Marika Borgeson. This project is an exploration of the sounds, spaces, and visuals of Las Vegas, through the eyes of a senior citizen looking to have a good time in Sin City.
"Recollection Chorus" by Sarah Goetz (2011). Recollection Chorus is a network of floating memories at the Duke immersive Virtual Environment. One walks into a virtual space surrounded by white nodes, which play poetic and didactic media clips on subjects explored in the artist's memory mind map. The media, collected from educational films of the mid-20th century, surround the user, creating a cacophony of commentary on memory.

"The Memory Tower" by Tim Senior (2011). The Memory Tower is a full-immersion virtual reality environment exploring the neural mechanisms behind the consolidation and reorganization of memory traces in the brain during sleep. The nature of memory formation is reflected through the use of historical, architectural motifs, serving as visual metaphors for both the content and structure of memory traces. By creating a rich cityscape that encompasses more familiar notions of memory from our everyday encounters with the world around us, the work aims to extend beyond a reductionist, scientific description of memory content and to acknowledge notions of collective human experience and history.
"Desired Dwellings" by Fatimah Tuggar (2010). Desired Dwellings is an interactive virtual artwork that allows the viewer/participant to explore a variety of living experiences. Floating animated elements with audio hover above the heads of viewers: these can be grabbed and placed in any of the living situations. It is also possible to scale, rotate, and place multiples of any of the elements.

Talks and Lectures

"Recent Projects in the DiVE Virtual Reality Lab". David J. Zielinski. (3/6/2014). slides
"Introduction to Unity and MiddleVR". David J. Zielinski. (2/18/2014). An overview of the Unity game engine and how to utilize MiddleVR to make Virtual Reality applications. slides
"1-Bit Audio and the Arduino". David J. Zielinski (12/03/2012). A look into the sound generation methods of the Atari 2600, along with explanations and demonstrations of some of the 1-bit noise generation possibilities of the Arduino Uno.   slides   code
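The core idea behind 1-bit audio is that the output has only two levels, like an Arduino pin driven high or low in a tight loop: pitch and timbre come entirely from when the pin toggles. A minimal illustrative sketch of this principle (my own Python illustration, not the talk's actual Arduino code) that renders a 1-bit square-wave tone into a sample buffer:

```python
# Illustrative 1-bit square-wave synthesis: every sample is 0 or 1,
# analogous to toggling a single digital output pin at audio rate.
SAMPLE_RATE = 8000  # samples per second (assumed for this sketch)

def one_bit_tone(freq_hz, duration_s):
    """Return a list of 0/1 samples approximating a square wave."""
    half_period = SAMPLE_RATE / (2 * freq_hz)  # samples per high/low phase
    samples, state, next_toggle = [], 0, half_period
    for i in range(int(SAMPLE_RATE * duration_s)):
        if i >= next_toggle:        # time to flip the "pin"
            state ^= 1
            next_toggle += half_period
        samples.append(state)
    return samples

tone = one_bit_tone(440, 0.01)  # 10 ms of a 440 Hz tone
```

On real hardware the same logic runs against a timer instead of a sample counter, and noise effects come from toggling at pseudo-random intervals rather than a fixed half-period.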

Interactive Installation Collaborations

"Link Media Wall" (2009) is a 4x12 high resolution display wall installed in the Link, which is found on the lower level of the Duke Perkins Library. The initial exhibit consisted of photos from the Duke Sidney D. Gamble collection. Current development of the Link Media Wall is done by Todd Berreth.   Website
"soundSpace" (2008-2011) is a 30 x 30 foot sound space where free movement creates sounds based on images captured by a series of web cameras. Each camera sends images to a cluster of computers that determine how much motion is in the space. The specific location of motion determines which specific sounds are generated. Some movement will initiate familiar outdoors noises such as wind chimes, rain, lightning. Other sounds will include musical instruments such as the marimba, gongs, voices, synthesizers and a variety of percussion instruments. NC Museum of Life and Science.   Video
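The "how much motion is in the space" step that drives installations like this can be approximated with simple frame differencing: compare successive camera frames pixel by pixel and count how many changed. A minimal sketch of that idea (my own illustration under assumed thresholds, not the installation's actual code), using plain Python lists as grayscale frames:

```python
# Illustrative frame differencing: estimate how much of the scene moved
# between two grayscale frames (lists of rows of 0-255 pixel values).
THRESHOLD = 30  # assumed minimum per-pixel brightness change to count as motion

def motion_amount(prev, curr):
    """Fraction of pixels whose brightness changed by more than THRESHOLD."""
    changed = total = 0
    for row_a, row_b in zip(prev, curr):
        for a, b in zip(row_a, row_b):
            total += 1
            if abs(a - b) > THRESHOLD:
                changed += 1
    return changed / total if total else 0.0

frame1 = [[10, 10, 10, 10]] * 4       # static background
frame2 = [[10, 10, 200, 200]] * 4     # something moved on the right side
print(motion_amount(frame1, frame2))  # 0.5: half the pixels changed
```

A full pipeline would additionally record *where* the changed pixels cluster, since in soundSpace the location of motion selects which sounds play.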
"Future and History of the Interface" (2007) was an interactive exhibit consisting of interview videos of technology pioneers and archival footage of early technology demos. The display of the videos and information was triggered by users' activity levels in the CIEMAS interactive studio. Presented during the HASTAC 2007 conference.   Website
"MIX TAPEStry" (2006) is an interactive, networked performance of music and graphics triggered by people moving under web cameras in the CIEMAS Studio and in the Krannert Art Museum at UIUC. The music is an adaptation of a hip hop track called "Lemonade" by Duke Adjunct Faculty member Robi Roberts (a.k.a. J. Bully). J. Bully performed his song while music and graphics by UIUC faculty member John Jennings were generated simultaneously at both locations.   Video   Website   Article
"soundSense" (2004) explores the intersection of reality as it is perceived and communicated by humans and computers. "Reality" in this case means the positions and motions of people in the studio. Although computer systems can detect activity in the room, they are unable to convey it in a manner that is as rich in nuance as the actual event. By representing electronically detected data as sound, we explore people's ability to discover and articulate patterns of sound and then associate that with a vision of reality. Through soundSense, researchers hope to autonomously transform the "sentience" of the room into a sound communication understandable to people.   Article