Solved

32-bit application calling a 16-bit DLL on Windows NT

Posted on 1997-11-06
Medium Priority
333 Views
Last Modified: 2013-12-03
Is it possible for a 32-bit application to call a third-party 16-bit DLL on Windows NT, using Generic Thunks' WOWCallback16...? I can hardly find any example of this either in MSDN or on the Web.

If it's not possible, are there any alternatives? The amount of data exchanged between the application and the DLL is huge.

I would appreciate any hints or details!
Question by:DonLi
3 Comments
 
LVL 23

Expert Comment

by:chensu
ID: 1408246
Generic Thunks allow a 16-bit Windows-based application to load and call a Win32-based DLL on Windows NT and Windows 95. A Win32-based application can load and call a 16-bit DLL on Windows 95 using a thunk compiler.
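
(Roughly, a generic thunk is driven from the 16-bit side: a small Win16 stub loads the Win32 DLL with LoadLibraryEx32W / GetProcAddress32W / CallProc32W, declared in wownt16.h, and the Win32 DLL can reach back into 16-bit code with WOWCallback16 from wow32.dll. Below is only a sketch of the 32-bit half; the function names, the registration step and the single-DWORD argument convention are assumptions, so check wownt16.h / wownt32.h for the exact prototypes.)

/* 32-bit DLL that runs inside the WOW (NTVDM) process because a 16-bit
 * stub loaded it via LoadLibraryEx32W.  The stub registers a 16:16 far
 * pointer to one of its own routines; the DLL later calls that routine
 * through WOWCallback16.  Names and conventions here are illustrative. */
#include <windows.h>
#include <wownt32.h>            /* WOWCallback16 (link with wow32.lib) */

static DWORD g_vpfn16 = 0;      /* 16:16 address of the 16-bit callback */

/* Invoked from the 16-bit stub (via CallProc32W) to register its callback. */
DWORD WINAPI Register16Callback(DWORD vpfn16)
{
    g_vpfn16 = vpfn16;
    return 0;
}

/* Invoked whenever the 32-bit side needs the 16-bit code to do some work. */
DWORD WINAPI CallInto16(DWORD dwParam)
{
    if (g_vpfn16 == 0)
        return (DWORD)-1;

    /* Switches to the 16-bit context and calls the far routine,
       passing a single DWORD and returning its DWORD result. */
    return WOWCallback16(g_vpfn16, dwParam);
}

Note that this only helps a Win32 DLL running inside the WOW process; a stand-alone 32-bit application cannot use WOWCallback16 to reach into a separate NTVDM, which is what the question is really up against.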
 
LVL 1

Expert Comment

by:Tiutin
ID: 1408247
Windows 95 implements a thunking model called flat thunks. Flat thunks allow 32-bit code to call functions implemented in 16-bit code. They also allow 16-bit code to call functions implemented in 32-bit code. Windows NT does not support flat thunks. Therefore, if you use flat thunks, your application cannot run on Windows NT unless you isolate your thunking code into platform-specific DLLs.

Windows NT uses a different thunking model. Windows NT supports generic thunks, which allow 16-bit code to call functions implemented in 32-bit code. Although Windows 95 supports generic thunks, it does not support the underlying process model used by Windows NT. This means that generic thunking code might not work identically under Windows 95 and Windows NT.

So for a 32-bit application calling a 16-bit DLL on Windows NT, there seems to be no direct answer.
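
If you do end up with flat thunks on Windows 95 and some other mechanism on Windows NT, the usual way to "isolate your thunking code into platform-specific DLLs" is to pick the helper at run time. A minimal sketch, assuming hypothetical helper DLLs named thunk95.dll and thunknt.dll:

#include <windows.h>

/* Loads whichever helper DLL hides the 16-bit access on this platform.
   The DLL names are made up for illustration. */
HMODULE LoadThunkHelper(void)
{
    OSVERSIONINFO osvi;

    osvi.dwOSVersionInfoSize = sizeof(osvi);
    if (!GetVersionEx(&osvi))
        return NULL;

    if (osvi.dwPlatformId == VER_PLATFORM_WIN32_WINDOWS)
        return LoadLibraryA("thunk95.dll");  /* flat thunks straight into the 16-bit DLL */

    if (osvi.dwPlatformId == VER_PLATFORM_WIN32_NT)
        return LoadLibraryA("thunknt.dll");  /* talks to a separate 16-bit helper instead */

    return NULL;
}

GetProcAddress on the returned module then gives the application one uniform entry point, whatever sits underneath.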
 
LVL 3

Accepted Solution

by:vinniew (earned 200 total points)
ID: 1408248
You can't load a 16-bit DLL into a 32-bit process on NT, so no matter what, the data has to cross a process boundary. That will always cost extra overhead.

You could write a small 16-bit app that uses a shared ("global") piece of memory to communicate with your 32-bit app. That's the fastest approach I've seen/used. The 16-bit app calls the DLL, and both sides set flags in the shared block when it's time to pass data back and forth.

Just don't pass pointers through the shared block (or convert them yourself, since 16-bit code uses segmented far pointers and 32-bit code uses flat addresses) and you should be OK.
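
For the 32-bit half of that scheme, a named file mapping plus a named event is one way to get a shared buffer and a "data ready" flag. This is only a sketch under those assumptions: the object names are invented, and the 16-bit helper, which has no CreateFileMapping of its own, would have to reach the same data through whatever channel you give it (the thunk layer on Windows 95, or plain file/DDE traffic on NT).

#include <windows.h>

#define SHARED_SIZE 4096

int main(void)
{
    HANDLE hMap, hDataReady;
    char  *pShared;

    /* Shared buffer backed by the page file; the name is illustrative. */
    hMap = CreateFileMappingA(INVALID_HANDLE_VALUE, NULL, PAGE_READWRITE,
                              0, SHARED_SIZE, "Shared16to32Buffer");
    if (hMap == NULL)
        return 1;

    pShared = (char *)MapViewOfFile(hMap, FILE_MAP_ALL_ACCESS, 0, 0, 0);
    if (pShared == NULL)
        return 1;

    /* Auto-reset event used as the "data ready" flag. */
    hDataReady = CreateEventA(NULL, FALSE, FALSE, "Shared16to32DataReady");

    /* Wait for the helper to signal, then read what it put in the buffer. */
    if (WaitForSingleObject(hDataReady, 30000) == WAIT_OBJECT_0) {
        char reply[128];
        lstrcpynA(reply, pShared, sizeof(reply));
        /* ... process reply ... */
    }

    UnmapViewOfFile(pShared);
    CloseHandle(hDataReady);
    CloseHandle(hMap);
    return 0;
}

The layout of the shared block (offsets, lengths, and the "don't pass pointers" rule above) is the contract both sides have to agree on.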

V

