David J. Zielinski
Research and Development Engineer
DiVE Virtual Reality Lab
Email: djzielin AT duke.edu
Office: 410 Teer Engineering Building
Mail: Box 90291, Teer Building - Room 305, Durham, NC 27708
Google scholar page: here
|ISS 320S, VMS 326S||Introduction to Programming and User Interface Design in Unity3D.||Instructor||2017 Fall - final projects; 2016 Fall - final projects|
|ME 555.05, ISS 590.01||Virtual Reality Systems Research and Design.||Guest lecturer and implementation support for students||2017 Spring|
|"Neurophysiology of Visual-motor Learning during a Simulated Marksmanship Task in Immersive Virtual Reality". Jillian M. Clements, Regis Kopper, David J. Zielinski, Hrishikesh Rao, Marc A. Sommer, Elayna Kirsch, Boyla Mainsah, Leslie M. Collins, Lawrence G. Appelbaum. Forthcoming, IEEE VR 2018.|
|"Sensorimotor learning during a marksmanship task in immersive virtual reality". Hrishikesh M. Rao, Rajan Khanna, David J. Zielinski, Yvonne Lu, Jillian M. Clements, Nicholas D. Potter, Marc A. Sommer, Regis Kopper, and Lawrence Gregory Appelbaum. Front. Psychol. doi: 10.3389/fpsyg.2018.00058. January 2018. link|
"Specimen box: A tangible interaction technique for world-fixed virtual reality displays". David J. Zielinski, Derek Nankivil, Regis Kopper. In 2017 IEEE Symposium on 3D User Interfaces (3DUI). IEEE, 2017.
"6 degrees-of-freedom manipulation with a transparent, tangible object in world-fixed virtual reality displays". David J. Zielinski, Derek Nankivil, Regis Kopper. Presented as a poster at 2017 IEEE Virtual Reality (VR). IEEE, 2017. PDF Poster
Patented as: "Systems and Methods for Using Sensing of Real Object Position, Trajectory, or Attitude to Enable User Interaction with a Virtual Object." PCT/US2017/053803.
|"Wireless, Web-Based Interactive Control of Optical Coherence Tomography with Mobile Devices." Mehta, Rajvi, Derek Nankivil, David J. Zielinski, Gar Waterman, Brenton Keller, Alexander T. Limkakeng, Regis Kopper, Joseph A. Izatt, and Anthony N. Kuo. Translational vision science & technology 6, no. 1 (2017): 5-5. Link|
|Neurology Surgery Simulator (EVD Placement). Utilizing Microsoft HoloLens. Summer 2017. Patented as: "Augmented Reality-Based Navigation for Use in Surgical and Non-Surgical Procedures". PCT/US18/17381|
"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments". David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Marc A. Sommer, Lawrence G. Appelbaum, Regis Kopper. IEEE Symposium on 3D User Interfaces (3DUI) 2016.
"Evaluating the Effects of Image Persistence on Dynamic Target Acquisition in Low Frame Rate Virtual Environments". David J. Zielinski, Hrishikesh M. Rao, Nick Potter, Lawrence G. Appelbaum, Regis Kopper. Presented as a poster at IEEE VR (2016). Honorable Mention for Best Poster Award. PDF Poster
|"Medial prefrontal pathways for the contextual regulation of extinguished fear in humans". Fredrik Ahs, Philip A Kragel, David J Zielinski, Rachael Brady, Kevin S LaBar. NeuroImage 2015. Link|
|"Exploring the Effects of Image Persistence in Low Frame Rate Virtual Environments." David J. Zielinski, Hrishikesh M. Rao, Marc A. Sommer, Regis Kopper. IEEE VR (2015). PDF Slides Video|
|"Spatial proximity amplifies valence in emotional memory and defensive approach-avoidance." Fredrik Ahs, Joseph E. Dunsmoor, David Zielinski, and Kevin S. LaBar. Neuropsychologia (2014). link|
|"The Walk Again Project: An EEG/EMG training paradigm to control locomotion". Fabricio L Brasil, Renan C Moioli, Kyle Fast, Anthony L Lin, Nicole A Peretti, Angelo Takigami, Kenneth R Lyons, David J Zielinski, Lumy Sawaki, Sanjay S Joshi, Edgard Morya, Miguel AL Nicolelis. Poster at Society for Neuroscience (SFN) Conference. 2014.|
|"Comparison of Interactive Environments for the Archaeological Exploration of 3D Landscape Data". Rebecca Bennett, David J. Zielinski and Regis Kopper. Presented at the IEEE VIS Workshop on 3DVis. Paris, France. November 9th, 2014. PDF|
|"Extinction in multiple virtual reality contexts diminishes fear reinstatement in humans." Joseph E. Dunsmoor, Fredrik Ahs, David J. Zielinski, Kevin S. LaBar. Neurobiology of learning and memory (2014). link|
|"Enabling Closed-Source Applications for Virtual Reality via OpenGL Intercept Techniques". David Zielinski, Ryan McMahan, Solaiman Shokur, Edgard Morya, Regis Kopper. Presented at the SEARIS 2014 workshop. Co-located with the IEEE VR 2014 conference. PDF|
|"Comparative Study of Input Devices for a VR Mine Simulation". David Zielinski, Brendan Macdonald, Regis Kopper. presented as a poster at IEEE VR 2014. PDF|
"Effects of virtual environment platforms on emotional responses". Kwanguk Kim, M. Zachary Rosenthal, David J. Zielinski, Rachael Brady. Computer Methods and Programs in Biomedicine. 2014 Jan 7.
"Comparison of Desktop, Head Mounted Display, and Six Wall Fully Immersive Systems using a Stressful Task". Kwanguk Kim, M. Zachary Rosenthal, David Zielinski and Rachael Brady. Presented as a Poster at the IEEE VR 2012 Conference, March 4th-8th. Orange County, CA. Published in the conference proceedings: p143-144. 2012.   link   pdf
|"Intercept Tags: Enhancing Intercept-based Systems". David J. Zielinski, Regis Kopper, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. VRST 2013. Nanyang Technological University, Singapore. October 6th-8th. Published in the conference proceedings. 2013.   paper pdf   demo video   talk video (duke)   talk slides|
|"ML2VR: Providing MATLAB Users an Easy Transition to Virtual Reality and Immersive Interactivity". David J. Zielinski, Ryan P. McMahan, Wenjie Lu, Silvia Ferrari. Presented as a poster at IEEE VR 2013. Candidate for best poster award. Pages 83-84. Published in the conference proceedings. 2013.   paper pdf   demo video   talk video (duke)   talk slides   github|
|"Evaluating Display Fidelity and Interaction Fidelity in a Virtual Reality Game". Ryan P. McMahan, Doug A. Bowman, David J. Zielinski, Rachael B. Brady. Presented as a full paper at IEEE VR 2012 conference. March 4th-8th, Orange County, CA. Published in the journal IEEE Transactions on Visualization and Computer Graphics (TVCG). p626-633. 2012.  link   pdf|
"Revealing context-specific conditioned fear memories with full immersion virtual reality". Nicole Huff, Jose Alba Hernandez, Matthew Fecteau, David Zielinski, Rachael Brady, Kevin S LaBar. Frontiers in Behavioral Neuroscience.
"Human Fear Conditioning Conducted in Full Immersion 3-Dimensional Virtual Reality". Nicole C. Huff, David J. Zielinski, Matthew E. Fecteau, Rachael Brady, Kevin S. LaBar. JoVE - Journal of Visualized Experiments. 2010.  link
|"Shadow Walking: an Unencumbered Locomotion Technique for Systems with Under-Floor Projection". David J. Zielinski, Ryan P. McMahan, Rachael B. Brady. IEEE Virtual Reality Conference. p. 167-170, March 19-23, 2011.   paper   slides   video   code   talk video (duke)   link|
|"KinImmerse: Macromolecular VR for NMR ensembles". Jeremy N Block, David J Zielinski, Vincent B Chen, Ian W Davis, E Claire Vinson, Rachael Brady, Jane S Richardson, and David C Richardson. Source Code for Biology and Medicine 2009.   link   paper   code|
|"VR Visualization as an Interdisciplinary Collaborative Data Exploration Tool for Large Eddy Simulations of Biosphere-Atmosphere Interactions". Gil Bohrer, Marcos Longo, David J. Zielinski, Rachael Brady. Proceedings of the International Symposium on Advances in Visual Computing (ISVC). Las Vegas, Nevada, December 1-3, 2008.   link   paper  video|
"Exploring Semantic Social Networks using Virtual Reality". Harry Halpin, David J Zielinski, Rachael Brady, Glenda Kelly. Proceedings of the International Semantic Web Conference (ISWC). Karlsruhe, Germany, October 26-30, 2008.   link
"Redgraph: Navigating Semantic Web Networks using Virtual Reality", Harry Halpin, David J. Zielinski, Rachael Brady, Glenda Kelly. Presented as a poster at the IEEE Virtual Reality 2008 Conference, Reno NV. March 8-12. Published in the conference proceedings p 257-258. 2008. link   text
|"Exploring Proteins". NEUROBIO 719 - Concepts in Neuroscience I. Professor Jorg Grandl. Spring 2017.|
|"Exploring Building Designs". CEE 429 - Integrated Structural Design. Professor Joe Nadeau. Student final projects. Spring 2017.|
|"Visualizing Blood Flow Through Arteries". BME 307: Transport Phenomena in Biological Systems. Professor Amanda Randles. Student final projects. Spring 2017.|
|"Dig@IT: Virtual Reality in Archaeology". Virtual visit to dig site in Catalhoyuk, Turkey. Emmanuel Shiferaw, Cheng Ma, Regis Kopper, Maurizio Forte, Nicola Lercari, David Zielinski (completed DiVE conversion). Spring 2017 Project Site|
|"Reimagining a Medieval Choir Screen". Virtual visit to Santa Chiara Church in Naples. Andrea Basso, Elisa Castagna, Lucas Giles, David Zielinski, Caroline Bruzelius. Fall 2016. Article|
|"Smoking and How it Changes the Brain - An Adventure in the Duke Immersive Virtual Environment". Rochelle D. Schwartz-Bloom, Rachael Brady. David Zielinski (programming). Holton Thompson (modeling). Jonathan Bloom (narration). 2012. Video|
|"Did You Mean?" (2012) by Elizabeth Landesberg, Peter Lisignoli, Laura Dogget. Did You Mean explores the implications of using the word "word" to imply a single vessel or container of meaning. It invites DiVE participants to play around inside language itself, moving through the multiplicities of possibilities for meaning in a single word, and especially in translation. In Sanskrit, any number of words can be combined using hyphens and still be considered a single modifier, the longest of which has 54 component words joined by hyphens. Participants can select certain elements of this word with the wand and see and/or hear how they are usually defined on their own. Selecting a part of the word with the wand triggers a sound and/or an image, which build to a cacophony of sometimes meaningful, sometimes jarring collisions.|
|"BIDDIE'S BIG BENDER" (2012) by Laurenn McCubbin and Marika Borgeson. This project is an exploration of the sounds, spaces and visuals of Las Vegas, through the eyes of a senior citizen looking to have a good time in Sin City.|
|"Recollection Chorus" by Sarah Goetz (2011). Recollection Chorus is a network of floating memories at the Duke Immersive Virtual Environment. One walks into a virtual space surrounded by white nodes, which play poetic and didactic media clips on subjects explored in my memory mind map. The media, collected from educational films of the mid 20th century, surrounds the user, creating a cacophony of commentary on memory.|
|"The Memory Tower" by Tim Senior (2011). The ‘Memory Tower’ is a full-immersion virtual reality environment exploring the neural mechanisms behind the consolidation and reorganization of memory traces during sleep within the brain. The nature of memory formation is reflected through the use of historical, architectural motifs, serving as visual metaphors for both the content and structure of memory traces. By creating a rich cityscape that encompasses more familiar notions of memory from our everyday encounters with the world around us, the work aims to extend beyond a reductionist, scientific description of memory content and to acknowledge notions of collective human experience and history.|
|"Desired Dwellings" by Fatimah Tuggar (2010). Desired Dwellings is a virtual artwork that allows the viewer/participant to explore a variety of living experiences. Floating animated elements with audio hover above the heads of viewers: these can be grabbed and placed in any of the living situations. It is also possible to scale, rotate, and place multiples of any of the elements.|
|"CMAC Rendezvous - DiVE Lab Presentation". David J. Zielinski. (9/14/2017). slides|
|"What's been happening in the DiVE?". Duke Friday Forum 4/17/2015. Regis Kopper and David Zielinski. (4/17/2015). slides|
|"Recent Projects in the DiVE Virtual Reality Lab". Duke Media Arts + Sciences Rendezvous. March 6th, 2014. David J. Zielinski. (3/6/2014). slides|
|"Introduction to Unity and MiddleVR". David J. Zielinski. (2/18/2014). An overview of the Unity game engine and how to utilize MiddleVR to make Virtual Reality applications. slides|
|"1-Bit Audio and the Arduino". David J. Zielinski (12/03/2012). A look into the sound generation methods of the Atari 2600, along with explanations and demonstrations of some of the 1-bit noise generation possibilities of the Arduino Uno.   slides   code|
|"Link Media Wall" (2009) is a 4x12 high-resolution display wall installed in the Link, which is found on the lower level of the Duke Perkins Library. The initial exhibit consisted of photos from the Duke Sidney D. Gamble collection. Later development of the Link Media Wall was done by Todd Berreth.   Website|
|"soundSpace" (2008-present) is a 30 x 30 foot sound space where free movement creates sounds based on images captured by a series of web cameras. Each camera sends images to a cluster of computers that determine how much motion is in the space. The specific location of motion determines which specific sounds are generated. Some movement will initiate familiar outdoors noises such as wind chimes, rain, lightning. Other sounds will include musical instruments such as the marimba, gongs, voices, synthesizers and a variety of percussion instruments. NC Museum of Life and Science.   Video   Museum Site|
|"Future and History of the Interface" (2007) was an interactive exhibit consisting of interview videos of technology pioneers and archival footage of early technology demos. The display of the videos and information was triggered by the users' activity levels in the CIEMAS interactive studio. Presented during the HASTAC 2007 conference.|
|"MIX TAPEStry" (2006) is an interactive, networked performance of music and graphics triggered by people moving under web cameras in the CIEMAS Studio and in the Krannert Art Museum at UIUC. The music is an adaptation of a hip hop track called "Lemonade" by Duke Adjunct Faculty member, Robi Roberts (a.k.a. J. Bully). J. Bully performed his song while music and graphics by UIUC faculty member John Jennings were generated simultaneously at both locations.   Video   Article|
|"soundSense" (2004) explores the intersection of reality as it is perceived and communicated by humans and computers. "Reality" in this case means the positions and motions of people in the studio. Although computer systems can detect activity in the room, they are unable to convey it in a manner that is as rich in nuance as the actual event. By representing electronically detected data as sound, we explore people's ability to discover and articulate patterns of sound and then associate that with a vision of reality. Through soundSense, researchers hope to autonomously transform the "sentience" of the room into a sound communication understandable to people.   Article|