How does storing data using allocation directives compare to storing data using the system stack at the machine language level? I know that data directives "hard code" initialized data into the hex file's ".data" segment; is the same true for stack variables and the program's ".stack" segment? (The assembly language book I'm reading from covers 16-bit DOS assembler.) And how does the performance of the two kinds of variables compare (are accesses to one type faster than the other, etc.)? Thanks for any information! :bg
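To illustrate what I mean, here's a minimal sketch (assuming MASM-style syntax and the small memory model; the variable names and values are just made up for the example) of the two kinds of storage I'm asking about:

    ; A made-up example contrasting a .data variable with a stack variable
    .model small
    .stack 100h
    .data
    initVar dw 1234h         ; allocation directive: 1234h is written into the
                             ; .data segment of the file at assemble time
    .code
    start:
        mov ax, @data
        mov ds, ax           ; point DS at the data segment

        ; access the statically allocated variable
        mov ax, initVar      ; direct (displacement-only) addressing into .data

        ; create and access a "stack variable"
        push bp
        mov bp, sp
        sub sp, 2            ; reserve 2 bytes on the stack at run time
        mov word ptr [bp-2], 5678h  ; value exists only while this frame lives
        mov ax, [bp-2]       ; BP-relative (base + displacement) addressing
        mov sp, bp
        pop bp

        mov ax, 4C00h        ; DOS terminate
        int 21h
    end start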