Essentials of Music Technology

by Mark Ballora
ISBN-10: 0190240911
ISBN-13: 9780190240912
Pub. Date: 04/15/2015
Publisher: Oxford University Press

Overview

Computers in music have gone from being a niche subject to becoming a ubiquitous presence that all music students are bound to encounter in their professional lives. Meant to serve as a general reference for music technology courses, Essentials of Music Technology provides an overview of musical acoustics, psychoacoustics, MIDI, digital audio, and sound recording.

Topics covered include:

* The Internet

* MIDI software

* The nature of digital audio storage

* Filters

* Effects

* Room acoustics

* Sampling and synthesis techniques

Product Details

ISBN-13: 9780190240912
Publisher: Oxford University Press
Publication date: 04/15/2015
Edition description: New Edition
Pages: 256
Product dimensions: 7.90(w) x 9.90(h) x 0.70(d)

About the Author

Mark Ballora has a background in theater arts, music composition, and multimedia production. He studied music technology at New York University and McGill University. He is Associate Professor of Music Technology at Pennsylvania State University.

Read an Excerpt

What Is Music Technology?

This newly defined component of music education means many things to many people. Music technology, a broad subject, has meanings that may differ for music educators, composers, performers, audio producers, electrical engineers, computer programmers, or perceptual psychologists. All of the music activities in these fields intersect in the personal computer. Within the span of the last decade or so, computers in music have gone from being a niche subject to becoming a ubiquitous presence that all music students are bound to encounter in their professional lives. Furthermore, the new and varied role of the computer in music making brings about surprising overlaps with all of these fields.

Prior to the 1980s, "music technology" (if the term was used at all) would most likely have referred to audio engineering, the conversion (transduction) of musical material into electricity for purposes of amplification, broadcasting, or recording. The first step in the process was the microphone, which performed the acoustic-to-electronic conversion. Once the musical material existed in the form of electrical current, it could be sent to an amplifier that would drive a set of speakers, thus relaying the material over a public address system. Alternatively, the electromagnetic radiation that resulted from the electrical current could be broadcast from an antenna for television or radio reception. Or, if the signal were to be recorded, the musicians were likely to be assembled in a recording studio, with a number of microphones strategically placed for optimal sound capture. The signals from the various microphones were combined in a mixer, which allowed a technician to adjust the relative volumes and stereo positions of each microphone's signal. With a mixer, it was also possible to send the signal to processing devices to adjust the character of the sound, making it, for example, sound as though it were occurring in a large room. Following effects processing, the mixed signals could be sent to tape for storage.
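
As a rough sketch of the mixing stage just described, the mixer's core operation amounts to scaling each microphone's signal by a gain, placing it in the stereo field, and summing the results. The Python fragment below is an illustration added for this overview, not taken from the book; the constant-power pan law and the function name are assumptions made for the example.

    import math

    def mix(signals, gains, pans):
        """Mix mono signals (equal-length lists of samples) into stereo.
        gains: 0.0 to 1.0; pans: -1.0 (hard left) to +1.0 (hard right)."""
        left = [0.0] * len(signals[0])
        right = [0.0] * len(signals[0])
        for signal, gain, pan in zip(signals, gains, pans):
            angle = (pan + 1) * math.pi / 4            # constant-power pan law
            l_gain = gain * math.cos(angle)
            r_gain = gain * math.sin(angle)
            for i, sample in enumerate(signal):
                left[i] += sample * l_gain
                right[i] += sample * r_gain
        return left, right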

Other possible meanings for "music technology" might have included the use of synthesizers for composing electronic music, an activity attractive to musicians who had a penchant for electrical assembly or who had the means to employ technicians to create and maintain the machinery. Perhaps least known to the general public were those who worked in high-end research institutions who had access to computers, sharing time on these mysterious machines with engineers and rocket scientists and programming them to emit sounds and music.

The personal computer has generalized and expanded these models and affected every area of education. Music technology, implying "the use of computers as an aid to music making," is now a subject that all music educators must address in some way. Small desktop computers may now be part of every step of the musical production process just described, acting as performer, mixer, processor, and storage medium. This development has implications for all practitioners of music, regardless of their specialty. Whether they find themselves working at a school, a recording studio, or a concert hall, musicians can count on finding computers at work in the production of musical activity. Performers are often expected to send CDs of their performances as part of applications for jobs. Educators are expected to employ the resources of the Internet and multimedia technology to teach students about music rudiments. Composers are expected to provide performers with laser-printed scores and parts. Thus, knowledge of music technology is becoming a core skill of musical training, along with history, figured bass, and Roman numeral notation.

Purpose of This Book

Things should be made as simple as possible, but no simpler.
— Albert Einstein

With any learning endeavor, it is typically the basics that are most difficult to master. This book is meant to provide an overview of essential elements of acoustics, MIDI, digital audio, and sound recording so that students may understand what musical elements may be stored in a computer and what type of manipulations and representations of musical information are possible. It is written in an attempt to find a balance between simple and straightforward presentations and descriptions that are thorough enough to explain the material without "dumbing it down."

The book is not meant to be a substitute for a qualified instructor. A performer cannot master an instrument without the help of an experienced musician, just as an athlete cannot excel at a sport without the guidance of a coach. But even experienced performers and athletes find it useful to consult a fingering chart or a rule book as a reference. This book is intended to play a similar role for a student learning some aspect of music technology. Different sections will be relevant for different types of projects or different levels of learning. It is broken into short sections in order to allow an instructor to assign reading selectively to focus on areas that are important for a given group of students.

The only assumptions are that students are familiar with a computer operating system and the fundamentals of music. Certain concepts are best presented with equations, but no advanced mathematical training is necessary to understand the main points. Computers deal with information in the form of numbers, and different programs may require information to be entered differently. A given program may ask for frequency values or musical pitches, for loudness in decibels or in arbitrary units. Mathematics is not presented for its own sake, but rather to give students alternative views of how to understand and work with music in a computer environment. The cognitive psychologist Marvin Minsky has stated that a thing or idea seems meaningful only when we have a variety of different ways to represent it, providing different perspectives and different associations. Viewing music through the lenses of acoustics, physics, or computer science has the potential to yield rich rewards in the form of the new perspectives and associations these disciplines bring to the meaning of music.
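
For example, a notation or sequencing program might identify middle C as MIDI note number 60, while a synthesis program expects its frequency in hertz, and a mixer might express a level either as a raw amplitude or in decibels. The short Python sketch below is an illustration added for this overview, not drawn from the book; the function names are hypothetical, but the underlying formulas (equal temperament with A4 = 440 Hz, and 20·log10 for amplitude in decibels) are standard.

    import math

    def midi_to_frequency(note_number: int) -> float:
        """Equal-tempered frequency in Hz, taking A4 = MIDI note 69 = 440 Hz."""
        return 440.0 * 2 ** ((note_number - 69) / 12)

    def amplitude_to_db(amplitude: float) -> float:
        """Convert a linear amplitude (0.0 to 1.0) to decibels relative to full scale."""
        return 20 * math.log10(amplitude)

    print(midi_to_frequency(60))   # middle C, roughly 261.63 Hz
    print(amplitude_to_db(0.5))    # roughly -6.02 dB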

Table of Contents

Preface

Chapter 1 Basic Acoustics
The Nature of Sound Events
Wave Propagation
Simple Harmonic Motion
Characteristics of Waves
Refraction and Reflection
Superposition
Standing Waves, Resonant Frequencies, and Harmonics
Phase

Speed and Velocity

Chapter 2 Music and Acoustics
What Is the Difference Between Musical Sound and Noise?
Properties of Musical Sound
Frequency/Pitch
Frequency Is Objective, Pitch Is Subjective
Human Pitch Perception Is Logarithmic

Loudness
Power
Amplitude
Intensity
Timbre

Chapter 3 Acoustic Factors in Combination: Perceptual Issues
Sound in Time
Localization of Natural Events
Simulated Localization in Audio Systems
Mismatches Between Measurement and Perception
Phase
Timbre
Loudness

Conclusion

Chapter 4 Introduction to Computers
Multimedia
The Internet
The World Wide Web
Caveat Emptor
Streaming Media
The Web and Music Research


Chapter 5 Representing Numbers
Numbers Are Power
Of What Value Power?
Numbers in Computers
The Binary Number System
Some Essential Technology
The Hexadecimal Number System
Integers and Floating Points

Chapter 6 Introduction to MIDI
A Brief Historical Background
What MIDI Is and What It Is Not
MIDI Compromises
MIDI Channels

Computers and MIDI
Central Traffic Control
Sequencing Software
Notation Software
Computer-Aided Instruction (CAI) Software
Accompaniment Software
Editor/Librarian Software
Connecting MIDI Instruments
Basic Configurations
Computer Configurations
The Computer as Sound Generator

Chapter 7 The MIDI Language

The MIDI Language, 1: Channel Voice Messages
Structure of Channel Voice Messages
Channel Voice Message Types

The MIDI Language, 2: MIDI Modes
Channel Mode Messages
Other Types of Mode Messages

The MIDI Language, 3: System-Level Messages
System Common Messages
System Real-Time Messages
System Exclusive Messages

MIDI and Time
MIDI Synchronization
MIDI Clock
Song Position Pointer
Frequency Shift Keying (FSK)
MIDI Time Code (MTC)

MIDI Implementation Charts

Chapter 8 MIDI and More
Nonkeyboard MIDI Instruments
The Challenge Imposed by MIDI
MIDI String Instruments
MIDI Wind Instruments
MIDI Percussion Instruments

Additions to the MIDI Protocol
Standard MIDI Files
General MIDI
Multi Mode
Karaoke Files
GS MIDI and XG MIDI
MIDI Machine Control (MMC) and MIDI Show Control (MSC)


Chapter 9 Digital Audio
Introduction
Digitizing Audio—The Big Picture
The Central Problem
Digital Conversion
Does Digital Sound as Good as Analog?

Characteristics of Digital Audio
Sampling Rate
The Sampling Rate of CD Audio and Its Origin
Quantization
Quantization vs. Sampling Rate
The Size of Audio Files

Filtering
What Is Filtering?
Filter Types

The Digital Filtering Process
Feedforward vs. Feedback Filters
Lowpass Filters
Highpass Filters
Bandpass and Band-Reject Filters
Other Filter Characteristics

The Digital Recording and Playback Process
Recording
Playback

Chapter 10 Working with Digital Audio: Processing and Storage

Spectral Representation
0 Hz = Direct Current
Spectra of Digital Signals
Convolution
Time Domain Localization vs. Spectral Resolution

Oversampling and Noiseshaping
Perceptual Coding
Psychoacoustics
Masking
Data Reduction
Storage Media
Compact Disc
Digital Audio Tape (DAT)
MiniDisc
DVD
DVD-Audio
Super Audio CD
Hard-Disk Recording—The Convergence of Multimedia
Digital Workstations
Transferring Data Among Devices
Audio Files


Chapter 11 Acquiring Audio
Room Acoustics
Direct and Reflected Sound
Large Performance Spaces
Small Performance Spaces

Microphones
Receptor Types
Transducer Types
Directionality

Microphone Configurations
Time-of-Arrival Stereophony
Intensity Stereophony
Near-Coincident Configurations
Support Microphones
5.1 Channel Configurations

Chapter 12 Treating and Mixing Audio

Effects: Introduction
Effects = Filtering
Filtering = Delay
Effects Processors and Word Length

Long Delays: Audible Echoes
Simple Delay
Multitap Delay
Feedback Delay

Building Blocks of Delay-Based Effects: Comb and Allpass Filters
Comb Filters
Allpass Filters

Delay-Based Effects
Flanging
Chorusing
Phase Shifting
Reverberation

Non-Delay-Based Effects
Ring/Amplitude Modulation
Compression/Limiting and Expansion/Noise Gating

Mixing
Channels
Phantom Power
Channel Insert
Equalization
Channel Fader
Mixer Buses
Auxiliary Buses
Mute/Solo
Pan
Output Buses
A Final Note on Levels


Chapter 13 Digital Instruments
Samplers
Sampler Variations
Synthesizers
Sound Fonts
Groove Boxes and Looping Software
Tracking Software

Software Synthesis
Building Blocks of Sound Synthesis
Additive Synthesis
Subtractive Synthesis
Phase Modulation
Vector Synthesis
Latency


Afterword
Appendix 1: Suggested Class Projects
Appendix 2: Web Page Template with MIDI File
References
Index
