Solved

Generating wave forms

Posted on 1998-11-19
387 Views
Last Modified: 2010-04-04
This is a tricky one. I need to generate wave forms and play them via the sound card. The data (x, y) should be used from an array and it should NOT be a simple sine, saw or any other common wave form. If this isn't hard enough I want to do this in STEREO with different arrays for each channel.

Maybe you don't have a complete solution, but I want to know as much as you know.
Question by:technologist
14 Comments
 
LVL 3

Expert Comment

by:Matvey
 
LVL 3

Expert Comment

by:Matvey
Now you know as much as I know...
 
LVL 8

Expert Comment

by:ZifNab
 
LVL 5

Expert Comment

by:scrapdog
If you don't want a simple waveform, combine several simple waves together.
 
LVL 2

Expert Comment

by:trillo
I've surfed the sites mentioned above and they are all about shareware or freeware, or how to use the multimedia functions.
When reading the question, I thought you were looking for more technical help (the wave format, how to build wave buffers (mono and stereo), which bytes in the buffers are for the left and the right channels, etc.)... Am I wrong?

 
LVL 8

Expert Comment

by:ZifNab
trillo, the first url, to 00000022.htm, gives info about the wave format, doesn't it?
 
LVL 5

Expert Comment

by:scrapdog
This is more a question of mathematics than it is files and program code...but that depends on what technologist has to say about it:

Is it safe to assume that you already know the file format?  Is it safe to assume you know how to play a wave?  

Do you just want to know how to generate the waves and place them in an array (or two), and then go from there?

Author Comment

by:technologist
I know the mathematics (heavy code!) and I've got all the data I want to play. The problem is that I want to play the wave in REAL TIME (not only "compiling" a filename.wav to play later, though I want to do that too) and send it to the sound card. The purpose is to reflect a laser beam off two loudspeakers, one for the x axis and one for the y axis, and "write" figures and letters. I have tested this with music and it works great! Now I want to do it from the computer.

So what I want to do is to place the array in a buffer (if that's the way to do it) and then play it. In stereo.


 
LVL 3

Expert Comment

by:Matvey
Try DelphiX, a Delphi implementation of DirectX that you can find on most Delphi pages. I think it has exactly the right component for this, although DirectX isn't the only way at all. You can find C sources of such programs all around; I can send you some of them, if it helps. Also, if you look at the audio pages on the Delphi sites, you'll see some components that might implement writing to the buffer in real time. Download them all (there aren't many) and look at the sources. Abstract, huh? Well, I'll look around for more, sorry...
 
LVL 3

Expert Comment

by:Matvey
 
LVL 2

Expert Comment

by:trillo
technologist: Give me a few hours and I'll try to write everything you need to start coding.
 
LVL 2

Accepted Solution

by:
trillo earned 200 total points
Here we go!
All wave data is stored in 8-bit bytes. The bytes of multiple-byte values are stored with the low-order (ie, least significant) bytes first. Data bits are as follows (ie, shown with bit numbers on top):


         7  6  5  4  3  2  1  0
       +-----------------------+
 char: | msb               lsb |
       +-----------------------+



                  7  6  5  4  3  2  1  0 15 14 13 12 11 10  9  8
                +-----------------------+-----------------------+
short (2 byte): | lsb     byte 0        |       byte 1      msb |
                +-----------------------+-----------------------+


A WAVE file is a collection of a number of different types of chunks. There is a required Format ("fmt ") chunk which contains important parameters describing the waveform, such as its sample rate. The Data chunk, which contains the actual waveform data, is also required. All other chunks are optional. Among the other optional chunks are ones which define cue points, list instrument parameters, store application-specific information, etc.

All applications that use WAVE must be able to read the 2 required chunks and can choose to selectively ignore the optional chunks. A program that copies a WAVE should copy all of the chunks in the WAVE, even those it chooses not to interpret.

There are no restrictions upon the order of the chunks within a WAVE file, with the exception that the Format chunk must precede the Data chunk.
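To make the chunk layout concrete, here is a minimal sketch in C (which maps directly to a Delphi packed record) of the canonical 44-byte header of an uncompressed PCM WAVE file: the RIFF chunk, then the required "fmt " chunk, then the "data" chunk. The field names follow the usual Microsoft naming; treat this as an illustration, not a parser, since real files may carry optional chunks between "fmt " and "data".

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Canonical header of a PCM WAVE file. All multi-byte values are
   little endian, as described above. Packed so no pad bytes appear. */
#pragma pack(push, 1)
typedef struct {
    char     riffId[4];        /* "RIFF" */
    uint32_t riffSize;         /* total file size - 8 */
    char     waveId[4];        /* "WAVE" */
    char     fmtId[4];         /* "fmt " (note the trailing space) */
    uint32_t fmtSize;          /* 16 for PCM */
    uint16_t wFormatTag;       /* 1 = PCM (uncompressed) */
    uint16_t nChannels;        /* 1 = mono, 2 = stereo */
    uint32_t nSamplesPerSec;   /* e.g. 44100 */
    uint32_t nAvgBytesPerSec;  /* nSamplesPerSec * nBlockAlign */
    uint16_t nBlockAlign;      /* nChannels * wBitsPerSample / 8 */
    uint16_t wBitsPerSample;   /* 8 or 16 */
    char     dataId[4];        /* "data" */
    uint32_t dataSize;         /* number of sample data bytes */
} WavHeader;
#pragma pack(pop)

/* Fill in a header for 16-bit stereo PCM. */
WavHeader make_header(uint32_t sampleRate, uint32_t dataBytes)
{
    WavHeader h;
    memcpy(h.riffId, "RIFF", 4);
    h.riffSize = 36 + dataBytes;           /* 44-byte header minus 8 */
    memcpy(h.waveId, "WAVE", 4);
    memcpy(h.fmtId, "fmt ", 4);
    h.fmtSize = 16;
    h.wFormatTag = 1;                      /* PCM */
    h.nChannels = 2;                       /* stereo */
    h.nSamplesPerSec = sampleRate;
    h.wBitsPerSample = 16;
    h.nBlockAlign = (uint16_t)(h.nChannels * h.wBitsPerSample / 8);
    h.nAvgBytesPerSec = h.nSamplesPerSec * h.nBlockAlign;
    memcpy(h.dataId, "data", 4);
    h.dataSize = dataBytes;
    return h;
}
```

nBlockAlign is the size of one sample frame, which is the unit the interleaving rules further down are expressed in.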

Very Important: Sample Points and Sample Frames

A large part of interpreting WAVE files revolves around the two concepts of sample points and sample frames.
A sample point is a value representing a sample of a sound at a given moment in time. For waveforms with greater than 8-bit resolution, each sample point is stored as a linear, 2's-complement value which may be from 9 to 32 bits wide (as determined by the wBitsPerSample field in the Format chunk), assuming PCM (uncompressed) format. For example, each sample point of a 16-bit waveform would be a 16-bit word (ie, two 8-bit bytes) where 32767 (0x7FFF) is the highest value and -32768 (0x8000) is the lowest value. For 8-bit (or less) waveforms, each sample point is a linear, unsigned byte where 255 is the highest value and 0 is the lowest value. Obviously, this signed/unsigned discrepancy between 8-bit and higher-resolution waveforms was one of those "oops" scenarios: someone at Microsoft changed the sign convention after 8-bit wave files were common but before 16-bit wave files had appeared. Remember: 8-bit sound is unsigned and 16-bit is signed. This is important when building your buffers.
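The unsigned-vs-signed point is easy to get wrong in code, so here is a small sketch (in C, though it translates directly to Delphi) of one common way to map a normalized sample in [-1.0, +1.0] onto the two encodings. The 127.5 midpoint for the 8-bit case is a convention of this sketch, not something the format mandates.

```c
#include <assert.h>
#include <stdint.h>

/* 8-bit PCM is unsigned: silence sits at the 128 line. */
uint8_t to_pcm8(double s)
{
    if (s > 1.0)  s = 1.0;
    if (s < -1.0) s = -1.0;
    return (uint8_t)(127.5 + s * 127.5);
}

/* 16-bit PCM is signed two's complement: silence sits at 0. */
int16_t to_pcm16(double s)
{
    if (s > 1.0)  s = 1.0;
    if (s < -1.0) s = -1.0;
    return (int16_t)(s * 32767.0);
}
```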

Because most CPUs' read and write operations deal with 8-bit bytes, it was decided that a sample point should be rounded up to a size which is a multiple of 8 bits when stored in a WAVE. This makes the WAVE easier to read into memory. If your ADC produces a sample point from 1 to 8 bits wide, it should be stored in a WAVE as an 8-bit byte (ie, unsigned char). If your ADC produces a sample point from 9 to 16 bits wide, it should be stored as a 16-bit word (ie, signed short). If your ADC produces a sample point from 17 to 24 bits wide, it should be stored as three bytes. If your ADC produces a sample point from 25 to 32 bits wide, it should be stored as a 32-bit doubleword (ie, signed long). Etc.
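The rounding rule above is simply "round up to whole bytes", which is a one-liner in C:

```c
#include <assert.h>

/* Container size for a sample point, as described above:
   1-8 bits -> 1 byte, 9-16 -> 2, 17-24 -> 3, 25-32 -> 4. */
int container_bytes(int bitsPerSample)
{
    return (bitsPerSample + 7) / 8;
}
```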

Furthermore, the data bits should be left-justified, with any remaining (ie, pad) bits zeroed. For example, consider the case of a 12-bit sample point. It has 12 bits, so the sample point must be saved as a 16-bit word. Those 12 bits should be left-justified so that they become bits 4 to 15 inclusive, and bits 0 to 3 should be set to zero. Shown below is how a 12-bit sample point with a value of binary 1010 0001 0111 is formatted left-justified as a 16-bit word.


 ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___
|   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
| 1   0   1   0   0   0   0   1   0   1   1   1   0   0   0   0 |
|___|___|___|___|___|___|___|___|___|___|___|___|___|___|___|___|
 <---------------------------------------------> <------------->
    12 bit sample point is left justified          rightmost
                                                  4 bits are
                                                  zero padded

But note that, because the WAVE format uses Intel little endian byte order, the LSB is stored first in the wave file as so:

 ___ ___ ___ ___ ___ ___ ___ ___    ___ ___ ___ ___ ___ ___ ___ ___
|   |   |   |   |   |   |   |   |  |   |   |   |   |   |   |   |   |
| 0   1   1   1   0   0   0   0 |  | 1   0   1   0   0   0   0   1 |
|___|___|___|___|___|___|___|___|  |___|___|___|___|___|___|___|___|
 <-------------> <------------->    <----------------------------->
   bits 0 to 3     4 pad bits                 bits 4 to 11
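The shifting and byte-splitting in the two diagrams above can be written out as a short C sketch:

```c
#include <assert.h>
#include <stdint.h>

/* Left-justify a 12-bit sample point into a 16-bit word (pad bits
   zeroed) and store it in little-endian byte order, as diagrammed
   above. */
void store12(uint16_t sample12, uint8_t out[2])
{
    uint16_t word = (uint16_t)((sample12 & 0x0FFF) << 4); /* bits 4..15 */
    out[0] = (uint8_t)(word & 0xFF);  /* LSB is stored first */
    out[1] = (uint8_t)(word >> 8);    /* then the MSB */
}
```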


For multichannel sounds (for example, a stereo waveform), single sample points from each channel are interleaved. For example, assume a stereo (ie, 2 channel) waveform. Instead of storing all of the sample points for the left channel first, and then storing all of the sample points for the right channel next, you "mix" the two channels' sample points together. You would store the first sample point of the left channel. Next, you would store the first sample point of the right channel. Next, you would store the second sample point of the left channel. Next, you would store the second sample point of the right channel, and so on, alternating between storing the next sample point of each channel. This is what is meant by interleaved data; you store the next sample point of each of the channels in turn, so that the sample points that are meant to be "played" (ie, sent to a DAC) simultaneously are stored contiguously.

The sample points that are meant to be "played" (ie, sent to a DAC) simultaneously are collectively called a sample frame. In the example of our stereo waveform, every two sample points makes up another sample frame. This is illustrated below for that stereo example.


  sample       sample              sample
  frame 0      frame 1             frame N
 _____ _____ _____ _____         _____ _____
| ch1 | ch2 | ch1 | ch2 | . . . | ch1 | ch2 |
|_____|_____|_____|_____|       |_____|_____|
 _____
|     | = one sample point
|_____|

For a monophonic waveform, a sample frame is merely a single sample point (ie, there's nothing to interleave). For multichannel waveforms, you should follow the conventions shown below for which order to store channels within the sample frame. (ie, Below, a single sample frame is displayed for each example of a multichannel waveform).

  channels       1         2
             _________ _________
            | left    | right   |
  stereo    |         |         |
            |_________|_________|


                 1         2         3
             _________ _________ _________
            | left    | right   | center  |
  3 channel |         |         |         |
            |_________|_________|_________|

The sample points within a sample frame are packed together; there are no unused bytes between them. Likewise, the sample frames are packed together with no pad bytes.
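For your laser application this means the x-axis array becomes channel 1 (left) and the y-axis array channel 2 (right), interleaved frame by frame before the buffer goes to the sound card. A minimal sketch in C, assuming 16-bit samples:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Interleave two per-channel arrays (e.g. x and y deflection data)
   into one buffer of stereo sample frames: L0 R0 L1 R1 ... */
void interleave(const int16_t *left, const int16_t *right,
                int16_t *out, size_t frames)
{
    for (size_t i = 0; i < frames; i++) {
        out[2 * i]     = left[i];   /* channel 1: left  */
        out[2 * i + 1] = right[i];  /* channel 2: right */
    }
}
```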

For information about Wave structures and functions go:
http://www.undu.com/DN970901/00000022.htm (proposed by ZifNab)
http://www.geocities.com/Yosemite/6037/lowaud.html (at my site; it's actually for C, but the function prototypes and structures are exactly the same in Delphi, only the syntax changes)

Voila, I hope I could help you.
Trillo.

 
LVL 2

Expert Comment

by:trillo
Now that I've posted my answer I see that the font is different, so if you want to see the diagrams properly, copy and paste the text into Notepad.
 

Author Comment

by:technologist
I'll have a look at it later. Thanks in advance!

If it works, I'll give you the highest grade (of course)

Thanks once again!

// Victor
