
Commencing Fall 2018, the Peabody Conservatory will offer a four-year Bachelor of Music degree in Music for New Media. This degree is designed for students with a particular interest in composing and producing music for emerging areas of non-linear entertainment such as computer games, virtual reality, augmented reality, and 3D spatialized sound for location-based experiences.

Once esoteric, this area of study is rapidly becoming mainstream as devices like the Oculus Rift, Google Glass, Microsoft HoloLens, and Samsung Gear arrive in the consumer marketplace. Building on the audio techniques developed for established game consoles and computers, new career paths are opening up for qualified music graduates well versed in the techniques and development platforms for these devices. Peabody is fully committed to developing the skills and creativity this work requires, and expects to help graduates pursue a wide range of placements in the interactive music workplace.

Students will study the fundamentals of music’s function within visual media, analyzing how and why music is used to enhance dramatic and emotional effect (e.g. by establishing time and place, creating empathy, tension, or suspicion, and adding subtext). As they transition from linear to non-linear media, they will learn to work in industry-standard interactive audio programming environments such as Unity and Unreal Engine. In addition to standard game platforms, computers, and handsets, we will investigate output devices ranging from holophonic headphones to multi-speaker arrays.

New Media studies will complement courses in the core building blocks of Peabody’s music degree—including composition, theory, ear training, sight-reading, and arrangement—amounting to a well-rounded degree qualification. Peabody’s extensive Recording Arts department will provide ample opportunities to record and mix music in its state-of-the-art studios. Compositions will be scored for traditional instruments but realized through software emulation.

Individual lessons are offered in the third and fourth years of the degree, during which students work one-on-one with a mentor to advance their technique, refine their understanding and execution of the skills they need, and produce the Capstone Project. Capstone projects, undertaken in the final year of the degree, will take advantage of the wide spectrum of unique collaborations possible at Johns Hopkins: students might choose to work with the JH medical campus to utilize EEG brainwave data to manipulate music, or to develop therapeutic applications for stroke victims; to program audio software from the ground up at the Whiting School of Engineering; or to collaborate with writers and directors at the JHU/MICA Film Centre to create real-time soundtracks for immersive video experiences.

New Media Seminars invite guests to present their work and share lessons from their experience with students in the department, either in person or via video conference. Each semester will include presentations on software and hardware tools and creative techniques, as well as “real world” insight into the business of composing for new media.


Thomas Dolby

Thomas Dolby’s 35-year career has produced a continuous string of breakthrough innovations. As an early MTV icon, he blazed a trail for electronic music with his imaginative videos. The same year his own record reached the top of the pop/dance charts, he co-wrote and produced the first-ever platinum-selling rap 12” single, “Magic’s Wand” by Whodini.