In late February and early March, New York Tech held its first-ever Future with Emergent Technologies Symposium, a two-part series for faculty and staff that focused on work inside and outside the classroom in virtual reality (VR), augmented reality (AR), mixed reality, and artificial intelligence (AI).
On February 26, New York Tech faculty and students were both inspiring and inspired by a series of presentations on teaching, learning, and researching with these evolving technologies through innovative, interdisciplinary projects. Two weeks later, on March 11, guest speaker John Cohn, Ph.D., chief scientist of the MIT-IBM Watson Artificial Intelligence Lab, spoke at the symposium’s second event, drawing on his own experiences to share how institutions like his teach, research, and learn with emergent technologies, how critical partnerships are formed, and how to approach and encourage the adoption of new technology.
During the first session, Executive Vice President and Chief Operating Officer Jerry Balentine, D.O., and Provost and Vice President for Academic Affairs Junius Gonzales, M.D., M.B.A., opened the symposium, sharing their appreciation for the rich breadth of resources and talent at New York Tech in a variety of disciplines and for the organized efforts of many who help raise New York Tech’s reputation through their work and dedication.
Presenters Stan Silverman, M.S., professor of instructional technology, and Randy Stout, Ph.D., assistant professor of biomedical sciences and director of the NYIT College of Osteopathic Medicine (NYITCOM) Center for Biomedical Innovation, who lead New York Tech’s new emergent technologies working group, noted that opportunities abound for faculty and students to work with these exciting technologies, regardless of discipline or major, even at the undergraduate level.
Presentations were followed by breakout rooms, where the presenters shared detailed information and answered questions about their work:
- Edward Piscitelli, a second-year NYITCOM student, shared information on a VR project related to remote osteopathic manipulative medicine training that he worked on with Sheldon Yao, D.O., professor and chair of osteopathic manipulative medicine, and Erum Ahmed, a third-year NYITCOM student. He also spoke about a second project on active range of motion and a third, with Alexander Lopez, J.D., associate professor of occupational therapy, that uses gameplay to increase exercise quality and duration for children with autism. In addition, he described a VR boxing trainer developed with Associate Professor Adena Leder, D.O., to serve as a partial replacement for the in-person Rock Steady boxing program, as well as two projects using the Oculus headset with Stout.
- Professor of English Kevin LaGrandeur, Ph.D., spoke about the history of AI, what it means to our culture, and the ethical implications of science and technology on society. In his book, Surviving the Machine Age, he looks at intelligent technology and the transformation of work—in other words, what automation is doing to jobs. He also talked about neural lace (a project being researched by Elon Musk’s company Neuralink), which is an AI brain implant for medical and other uses, and the related ethical questions that go with it, including who can gain access to this technology.
- John Misak, Ph.D., assistant professor of English, presented on the integration of AR/VR in education and a research project he is working on with LaGrandeur that blends AR/VR, gaming, and literature instruction to encourage active engagement and help put learning into students’ hands. He presented a demo of “Perchance, an AR Hamlet Mystery” and explained how games can enhance learning when learning outcomes are designed at the outset. Misak also spoke about Twine, an easy-to-learn game coding tool that students can use to learn game design, practice code, study narrative structure, gauge an audience, and write instructions.
- Pablo Lorenzo-Eiroa, M.Arch., presented New York Tech’s new graduate architecture program, the M.S. in Architecture, Computational Technologies, and shared some high-tech projects taking place in the School of Architecture and Design. One involves rezoning New York City by applying AI to big data, layering transportation and other data sets to expand green space, improve sustainable practices, and ease traffic, ultimately creating a self-regulated urban environment. A second project, which will be displayed at this year’s Venice Biennale, uses structural and environmental simulation along with AI, examining environmental processes to design a particular space and expanding the use of available technologies, such as robotic systems, algorithms, and simulations, to create and fabricate architecture.
- Dominica Jamir, a graduate student in the UX/UI (user experience/user interface) design and development program, created a VR application that immerses students in their biology and chemistry lessons, allowing molecules to come to life before their eyes. Her project, “Intellect VR: Learning in VR; The VR Experience in the Classroom,” was named “Best Practitioner Poster Proposal” at the 2020 International Conference of the Immersive Learning Research Network (iLRN). The application, which was featured in The Box, can be applied to multiple disciplines and is accessible to people with disabilities.
- Sung Kevin Park, assistant professor of digital art and design, showcased his project “Meet me at Woodstock: An Augmented Reality Tour Experience.” The one-hour, one-mile walking tour, created for the Woodstock Museum for the festival’s 50th anniversary, used a 360-degree immersive panorama and immersive audio. Eight New York Tech students contributed to and tested the product.
- Avery Gilson, a laboratory technician in the Center for Biomedical Innovation, focused on robotics projects, including Dexter, a 3-D-printed robotic arm whose computational power allows precise, powerful movements; 3-D bioprinter applications that extrude organic materials; and a photogrammetry application designed to speed up photogrammetry scans and capture all sides of an object. Gilson is also working with Professor of Physics Ben Ovryn to mathematically model the accuracy and precision of Dexter’s movements.
- Randy Stout shared his research on VR and spatial computing for nanoscale biological discovery, which examines memory deficits at the molecular level in 3-D. In a virtual environment, researchers like Stout can study how brain cells associate and share metabolic fuel, processes that affect memory formation.
- Associate Professor of Computer Science Kiran Balagani, Ph.D., discussed AI and cyberspace, including emerging threats and new opportunities for applying AI models to improve password security and recover from cybersecurity attacks. He is also working on a project, “Continuous Behavioral Authentication for Smartphone Security,” to verify that the owner is the one using the phone. Another project, supported by a National Science Foundation grant and involving faculty researchers from the School of Health Professions and NYITCOM, analyzes posture and movement for behavioral identification to strengthen the security of mobile biometrics.
- Michael Nizich, Ph.D., director of the Entrepreneurship and Technology Innovation Center (ETIC) and adjunct assistant professor of computer science, gave an update on the ETIC Research Robot for Student Engagement and Learning Activities (E.R.R.S.E.L.A.). The web-based robotics project provides a collaborative research and engagement program designed to attract students across all disciplines and skill levels and allow them to participate without having to be physically present at the ETIC on the Old Westbury campus. At the height of the pandemic, Nizich launched the iNTEREST program, a series of twice-weekly seminars conducted via Zoom to help train new and current participants in skills related to the project, from developing software and web applications to creating and managing databases. During the symposium, he invited faculty and their students to collaborate with his team on healthcare applications for the robot, such as pushing a wheelchair or delivering medication in a sterile hospital environment.
- What spaces can support the release of post-traumatic patterns held by the body? That was the question that Aleksandra Zatorska, a graduate student in the School of Architecture and Design, posed during a discussion on how architects and designers can use VR to test their plans for built environments. Using VR, Zatorska’s study monitors individuals’ neurological responses to design ideas, including whether proposed spaces elicit feelings of trauma or comfort. Her study aims to ensure that spaces promote well-being and healing before they are built.
- Michael Uttendorfer, Ph.D., special assistant to the provost, shared an interactive VR mathematics lesson in which a robot demonstrates radius, frequency, and other terminology. Uttendorfer, an instructional technology expert, explained how visual graphics and active participation can enhance retention among students.