masm newbie - need help getting started

Started by Q101, January 16, 2006, 05:29:22 PM


Q101

Hello,

I know this has got to be a simple problem, but it is driving me crazy and I need some help... the instructor I am trying to work with really doesn't know.    :eek

The code was generating a COFF error when I tried to build it with a plain version of MASM (i.e. ML /Zi mycode.asm).

I thought MASM32 was the answer to my *prayer*.  Then, when trying to build it in MASM32, I got the same error. So I next made sure I had installed SP1 and SP2. Now, I get the following error when I try to run ML from the command prompt:

'ml' is not recognized as an internal or external command,
operable program or batch file.

When I try to assemble it in QEditor, using the example as my base, it seems like I am getting further, but I am still getting errors in the assembly.

The errors start off with:
C:\masm32\asm\hello.asm(4) : error A2085: instruction or register not accepted in current CPU mode

And then continue until the error limit is reached with:
\masm32\include\windows.inc(50) : error A2119: language type must be specified


This is the basic code I am trying to run:

title Hello World Program (hello.asm)

; This program displays "Hello, world!"
.model small
.stack 100h
.data
message db "Hello, world!",0dh,0ah,'$'

.code
main proc
    mov ax,@data          ; point DS at the program's data segment
    mov ds,ax

    mov ah,9              ; DOS function 09h: display the $-terminated string at DS:DX
    mov dx,offset message
    int 21h

    mov ax,4C00h          ; DOS function 4Ch: terminate with return code 0
    int 21h
main endp
end main

Can someone point me in the right direction? ANY suggestions or help are greatly appreciated!

Thanks,

Q

hutch--

The problem is simple: you are trying to build 16-bit DOS code, which does not work with the MASM32 project because it uses a 32-bit linker. If you want to use a 16-bit linker to build 16-bit code, there is one on the forum web site, which you would normally rename to link16.exe so it does not conflict with the 32-bit version. You will have to write your own batch files for building 16-bit code, as MASM32 only directly builds 32-bit code.
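
For contrast, a normal 32-bit console build with the MASM32 tools goes along these lines (just a sketch, with the file name as a placeholder):

ML /c /coff filename.asm
LINK /SUBSYSTEM:CONSOLE filename.obj

Your 16-bit DOS program cannot go through that 32-bit LINK step, which is why it needs the separate 16-bit linker instead.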
Download site for MASM32      New MASM Forum
https://masm32.com          https://masm32.com/board/index.php

raymond


And I would strongly suggest to your instructor that they give up teaching 16-bit assembly and switch to teaching 32-bit assembly. The 16-bit stuff has become a specialty which you would very rarely need to know in the modern world.

You could compare teaching 16-bit assembly to teaching Latin as a preliminary for an English course. :snooty:  :snooty: :snooty:
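
For comparison, a minimal 32-bit MASM32 version of the same program could look something like the sketch below (it uses the standard MASM32 include files and the MessageBox API, and the paths assume a default C:\masm32 install):

.386
.model flat, stdcall
option casemap:none

include \masm32\include\windows.inc
include \masm32\include\kernel32.inc
include \masm32\include\user32.inc
includelib \masm32\lib\kernel32.lib
includelib \masm32\lib\user32.lib

.data
szCaption db "Hello",0
szMessage db "Hello, world!",0

.code
start:
    invoke MessageBox, NULL, ADDR szMessage, ADDR szCaption, MB_OK
    invoke ExitProcess, 0
end start

That builds with the 32-bit tools (ML /c /coff and the 32-bit LINK) rather than the 16-bit DOS linker.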

Raymond
When you assume something, you risk being wrong half the time
http://www.ray.masmcode.com

Q101

Thank you both for your replies!

I will retry with the 16-bit linker... but I still don't understand why it was not working with the MASM and the files that came with the book, since those should have been the 16-bit versions...?

I agree about the code. The instructor said that they have been after the school to modify it... but I am still worried, since they apparently don't know enough to help students with the most basic part of getting going!  :boohoo:

Thank you again!  :U

Q

MichaelW

When using a 16-bit linker you need to assemble and link in separate steps. Assuming ML.EXE, ML.ERR, and the 16-bit linker renamed to LINK16.EXE are in the current directory, a minimal batch file could be like this:

ML /c filename.asm
pause
LINK16 filename.obj;
pause


Or one that generates a listing and a map file (with public symbols), and includes CodeView info (in the object file and the exe), like this:

ML /c /Fl /Sa /Zi filename.asm
pause
LINK16 /MAP /CO filename.obj,filename.exe,filename.map;
pause
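
If ML.EXE and LINK16.EXE are not in the current directory, or you want the batch file to stop when the assembly step fails, something along these lines should also work (the path is just an example for a default MASM32 install):

@echo off
rem make the MASM32 binaries visible from anywhere
set PATH=C:\masm32\bin;%PATH%
ML /c filename.asm
if errorlevel 1 goto done
LINK16 filename.obj;
:done
pause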

eschew obfuscation

macleod199

Just to add fuel to this discussion, I came here because we're having problems with MASM 4.0 in the microprocessor course I'm TAing. It crashes when there are too many environment variables defined, which became a problem as more software got installed on the system. Weird. So I've been trying to get a newer version to work with our setup. We need 16-bit because we're using 8086 development boards from back in the day. For real. They're trying to move mostly to newer Motorola boards, but here are some reasons why the old setup is still good:

- x86 is still pretty standard, good to know the basic opcodes
- we're trying to teach them about things like cache flushes and instructions per clock. Try doing that on a modern processor... there's just way too much going on.
- the CPU clock is synchronous with the bus clock. See the point above.
- we can hook the bus up to a logic analyzer so they can see the actual waveforms on the bus, all nice and square. With the degree of near-overclocking going on now, I really doubt the 1s and 0s would be as clear on a Pentium board
- Intel used to give you all sorts of manuals and datasheets with their stuff, so we get to expose the students to the wonderful world of digging through these poorly worded things looking for the correct bit codings and whatnot. Which definitely is a required skill with real embedded hardware.

Learning the most 'modern' tech doesn't always teach you the most.

Glad to find such a useful forum, by the way, it's great that this exists. Otherwise we'd be pretty stuck.