Exploded? That sounds weird. Is it possible that you have unwelded your polygons by accident, so every poly is separate? In LightWave you have to do this before UV mapping, but afterwards you need to weld it back together…
Also, if you want to use displacement, make sure your UVs are as continuous as possible; otherwise you may see cracks or “explosions”. Imagine that every subdivided polygon is separate during the displacement phase: if the displacement texture flows smoothly from poly to poly, they form a smooth, continuous surface. If the texture jumps from poly to poly, they become detached.
This is an image I did where I actually wanted this to happen:

The main mesh here is the spiral you know from the AoN:studio chainmail examples, displaced more and more with distance by the TLHPro:Wireframe shader, which here uses random colors on the original polys. As you can see, the displacement within each main poly is constant, so it sticks together and is smoothly rounded by the MetaNurbs; but between the main polys there is a hard jump in displacement that detaches the groups from each other. You could also do this with the small subdiv polys, but that wouldn’t look as nice.
So here I wanted this behavior and used a texture with sharp jumps. Normally you don’t want this to happen, so you have to make sure the texture flows smoothly from poly to poly.
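The crack mechanism above can be sketched in a few lines of Python (hypothetical numbers and function names, not messiah’s actual pipeline): each polygon displaces its own copy of a shared-edge vertex along the normal, using whatever texture value it samples there.

```python
# Minimal sketch: two polys share an edge vertex; each poly displaces
# its OWN copy of that vertex along the normal (here simply +Y).

def displace(vertex_y, texture_value, scale=1.0):
    # Move the vertex along its normal by the sampled texture value.
    return vertex_y + scale * texture_value

shared_edge_x = 1.0

# Continuous texture: both polys sample the same value at the shared
# edge, so their two copies of the vertex land in the same place.
continuous = lambda x: 0.5 * x
left_copy  = displace(0.0, continuous(shared_edge_x))
right_copy = displace(0.0, continuous(shared_edge_x))
print(left_copy == right_copy)               # True -> surface stays welded

# Per-poly constant texture (like random colors per original poly):
# the copies get different displacements and the polys detach.
left_copy  = displace(0.0, 0.2)              # poly A's constant value
right_copy = displace(0.0, 0.8)              # poly B's constant value
print(round(abs(left_copy - right_copy), 6)) # 0.6 -> visible gap ("explosion")
```

The second case is exactly what the wireframe-shader image exploits on purpose, and what a UV seam does to you by accident.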
Some other things:
Never use .3DS if you can avoid it. It supports only triangles, so it triangulates your mesh, which is bad for MetaNurbs and any further refining. It also has a polygon limit per object (65,536 faces, due to its 16-bit indices).
OBJ is a much better and more mature format.
LWO isn’t that much better, IMO. It depends more on how good the OBJ export of your tool of choice is. Each format has features that another app may or may not support. For instance, most apps export a .mtl file alongside the .obj that contains the material information, but not many apps use it on import…
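For reference, here is roughly what that pair of files looks like (hypothetical file and material names, single quad only):

```
# cube.obj -- note the quad face; .3DS would force two triangles here
mtllib cube.mtl
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 1.0 1.0 0.0
v 0.0 1.0 0.0
usemtl Skin
f 1 2 3 4

# cube.mtl -- material info that many importers simply ignore
newmtl Skin
Kd 0.8 0.6 0.5
```

So even when the exporter writes a perfectly good .mtl, whether your materials survive the round trip is entirely up to the importing app.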
The cool things in LWOs are endomorphs and layers, but both aren’t really needed, just convenient.
As for the initial mesh resolution: imagine you create a complete, complex character in ZBrush from an initial 6-sided cube. It would be impossible to cleanly transfer that into a displacement map, and you also wouldn’t be able to animate it.
I think Taron’s Neckling is a rather perfect example. It already has some initial detail, you can see where the ear, the nose, or the lips will be, and it has enough detail to be animated, but nothing more. So make sure you have the rough detail in the initial mesh for displacement.
There is a point where having less geometry gets in the way more than it helps and no longer buys you any speed. Game models are often a good reference for very reduced but complete meshes that would transfer nicely into displacements.
Also make sure that parts that get more detail in the displacement (like, say, the face) are also a bit more detailed in the initial mesh than, say, the leg.
The other important thing: always use Precalculated Polygons for displacement (I think this is even switched on automatically in the latest versions), since Parametric is pretty much useless in real life, although it is the nicer approach in theory.
If you use initial meshes with little detail, use Subdivision settings of 6–10 to test the result. Watch your memory consumption in the Windows Task Manager and note how long messiah takes before it starts rendering. Then, if the result is perfect, you can try lower settings to save some memory and time; if it is poor, raise the settings to something like 15 or 20. My highest setting so far has been 40.
I hope this helps.
Cheers!