You'd think Blender would run better...


#1

I finally did some upgrades to “ye olde graphics machine.” I upgraded the G41T-M5 to 4 GB of RAM and added a Galaxy GT 620 graphics card with 2 GB of video RAM. The system still takes its chunk of RAM up front, but I’m left with over 3 GB to actually work with. I’m having difficulty getting it to work in dual-monitor mode. In fact, XP Pro 32-bit says I don’t have that capability and refuses to force detection of the monitor hooked to the motherboard.

Blender performance hasn’t improved a bit. Maybe some of you more seasoned CG’ers could offer some suggestions to solve these dilemmas?


#2

You might be experiencing a bottleneck somewhere besides the graphics card, for example the CPU. Also, why Windows XP? If you’re using Blender, you’d be better off with Linux these days than with Windows XP.


#3

You should upgrade that 32-bit XP to something from this past decade.


#4

I’ll upgrade to Windows 7 64-bit in January. The CPU is a Celeron E3300 at 2.5 GHz; it’s a 64-bit CPU. I was wondering if I might need to change some settings in Blender. I doubt I’d ever go Linux. I do too much work in Windows to go changing OSes now; it would mean I’d been learning the Windows API for nothing.


#5

Regarding the dual monitors: could you describe how your monitors are connected?


#6

One 19" monitor, for the graphics work, is connected to the Galaxy card’s VGA port. The other is connected to the motherboard’s onboard VGA port. It’s the onboard connection that is not being recognized.


#7

Connecting a gfx card will very likely disable the onboard connector; plug both monitors into the new gfx card instead.


#8

The card only has one VGA port. There is a second port that appears to be for a television. I’ve read in other discussions that the onboard port can supposedly be re-enabled.


#9

It’s much easier to use your graphics card for both monitors, you know.

Your card comes with VGA, HDMI, and DVI ports. They’re not TV-specific in any way; computer monitors these days also have those three ports.

http://www.gpuzoo.com/GPU-Galaxy/GeForce_GT620_2GB.html

A DVI-to-VGA adapter is fairly easy to find, too, if your monitor really doesn’t have a matching port.


#10

It depends on the motherboard. With some boards, the integrated graphics and the discrete graphics are mutually exclusive because the PCI Express lanes get repurposed for the discrete card. Either way, I wouldn’t bother with the integrated graphics anymore, because they’re slow and eat up more of the already scarce 32-bit memory address space. The best option would be just to get an adapter for the other output on the discrete card, like Panupat suggested.


#11

Adapter, as in HDMI to VGA? I’m in no running-down-the-street-on-fire rush to get the dual mode working. If Wally World has a cheap adapter, I’m going there tomorrow; otherwise I’d just wait till my next order from OutletPC. By dual monitors, though, you guys are referring to using one for my work while using the other for other things? I don’t simply want a splitter that shows the same thing on both monitors.

I’m going to guess I’d need to bring up the Blender issues in another forum? It seems there are settings in the software that use particular system drivers, as in GPU or NVIDIA. Does anyone know where those might be right offhand?

As for the monitors, I’m getting the other half a 39" HDTV for Xmas, so I plan to start using the 32" for my graphics monitor then and a standard CRT for the utility stuff. Does that sound like it will be a problem? January is OS upgrade month.


#12

It might be DVI to VGA, depending on the model of graphics card. They’re common; I’m not sure if Walmart would have one (maybe a big Walmart?), but Radio Shack or Best Buy would. Yes, it would allow you to have different content on each monitor, not just a mirror.

You’re barking up the wrong tree in my opinion. The processor is more likely the bottleneck in this scenario. Blender is already using the graphics card by default since the viewport is OpenGL and the Nvidia drivers and graphics card handle that. You installed the Nvidia drivers, yes?

Televisions make very poor monitors. They mess with the colors, sharpness, and usually overscan. I’d suggest getting a computer monitor instead of a television if the primary purpose is using it with the computer. A monitor with 1920x1080 resolution can be had for around $100.


#13

The particular TV has a PC choice in its menu. $100 doesn’t get much screen real estate that I’ve seen. I suppose trying it to see if it is a solution is the only way I’ll know for certain.

Yes, I installed the NVIDIA drivers that came with the card. It gives me resolutions up to 2400+ x 2000+, but I’d have to strap on a magnifier to read anything at my age.

How is the CPU the bottleneck? I realize a two-core is not a state-of-the-art processor, and that XP 32-bit does not allow for an optimum system. The MoBo won’t take a four-core. It’s a shame we can’t use an extra PCI slot for another processor… Back in the day, a 386SX had a separate socket for a math coprocessor, and we could install RISC processors. I have an extra PCI Express (x1) slot and a 32-bit v2.3 PCI slot doing nothing.


#14

Let’s take a step back. What specifically were you hoping the graphics card upgrade would do? Are you trying to improve the performance of the viewport, like when you navigate around the scene and tumble around objects? Or are you trying to improve the performance of the rendering of the final images and videos? Or something else entirely?


#15

Probably what you’re calling “tumbling around.” For example, when I get a lot of verts in the object I’m creating, or particle systems, I can grow a beard waiting for the pointer to catch up with my mouse movement. Hair, for example, would likely take me a few hours to get combed into place.

It’s tough to have a memorable benchmark of performance, though, because I haven’t been using the program for a few months; a load of other work, and now some annoyances, take up my time. It may have improved slightly, if I recall the earlier lag correctly, but nothing to write home about. (Oh yeah… I AM home :) )


#16

Particle systems and hair are calculated by the processor, and the resulting geometry is then handed off to the graphics card to draw using OpenGL. That graphics card can handle a million or more particles in realtime (30+ FPS), so that’s probably not the bottleneck you’re experiencing. It sounds to me like the processor is the bottleneck, at least regarding particle systems and hair. It could also be swapping to disk when running out of memory, but that can’t happen here with 4 GB on 32-bit Windows XP, because all of the addressable memory space is already backed by physical memory.
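To see why the processor dominates here, a toy pure-Python particle step is enough (a hypothetical sketch, not Blender’s actual solver): every frame, every particle’s position is recomputed on the CPU before any geometry reaches the GPU.

```python
import random
import time

def step_particles(particles, dt=1.0 / 30):
    """Advance every particle one frame on the CPU (gravity only).

    This mirrors, in miniature, what a particle system does each frame:
    the positions are recomputed by the processor, and only the finished
    geometry is handed to OpenGL to draw.
    """
    for p in particles:
        p["vy"] -= 9.81 * dt    # apply gravity to the velocity
        p["x"] += p["vx"] * dt  # integrate the position
        p["y"] += p["vy"] * dt

# 100,000 toy particles -- far fewer than a production hair system.
particles = [
    {"x": random.random(), "y": random.random(), "vx": 0.0, "vy": 0.0}
    for _ in range(100_000)
]

start = time.perf_counter()
step_particles(particles)
elapsed = time.perf_counter() - start
print(f"one frame of 100,000 particles took {elapsed:.3f} s on one core")
```

Note the loop runs on a single core, so a faster graphics card changes nothing about how long it takes; only a faster processor (or fewer particles) does.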

When you’re working on something, open up Task Manager and watch the processor activity. You’ll likely see one or both processor cores maxed out when working with particle systems and hair. If you want to improve the performance of CPU-bound tasks, I’d suggest upgrading the motherboard, processor, and memory. Spending $400 or $500 on those components could provide a substantial boost over the Celeron in the system now.


#17

Wait, hang on. It’s been a while since I’ve looked into it, but WinXP 32-bit has 4 gigs of addressable space. The gfx card will be taking 2 gigs of that space, and the OS always takes half of the remaining 2 gigs of system RAM by default, so unless I’m missing something, you probably only have 1 gig of RAM to work with. I’m not surprised in the slightest that your performance sucks.
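As a back-of-envelope sketch of that arithmetic (round, hypothetical numbers; how much address space the GPU and kernel actually reserve varies by chipset, driver, and OS configuration):

```python
# 32-bit address-space budget, worst case, in round hypothetical numbers.
GB = 1024 ** 3

total_address_space = 4 * GB  # hard limit of a 32-bit address space
gpu_reserved        = 2 * GB  # worst case: the whole 2 GB card mapped in
os_reserved         = 1 * GB  # kernel, drivers, other hardware mappings

usable = total_address_space - gpu_reserved - os_reserved
print(f"left for applications: {usable / GB:.0f} GB")
```

The exact split differs in practice (32-bit Windows also caps each process’s user-mode space on its own), but the shape of the problem is the same: the ceiling is fixed at 4 GB, and everything mapped into it comes out of what applications can use.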

You need a 64-bit OS; it’s really not optional at this point.


#18

If the 32-bit address-space ceiling were a factor, it wouldn’t slow down; it would simply crash when the application requested more than the maximum addressable space. I’ve been there. It doesn’t slow down because there’s nothing to swap to (all of the addressable space is already backed by physical memory). I agree completely that a 64-bit operating system is in order (hence the earlier Linux suggestion), but it’s not the performance bottleneck described.


#19

K…

The machine I’m typing this on has 4 GB of RAM and a 2.0 GHz P6100 CPU (a laptop) with onboard gfx. It does much better, but it’s also 64-bit. I agree about upgrading to at least Windows 7 64-bit. Linux is not an option with my current coding impetus, same as learning more about Blender: it’s wise to find something one likes and master it.

It sounds as though I’ve done just about all I can with the other machine for now. I’ll install a larger HDD and the OS by January, but my intent is simply to spend a grand on a whole new system sometime next year. As long as a machine is relatively current, I rarely trash it.

As I mentioned, I have a lot of other work, too. I’m researching dark matter with a variation of an atom interferometer, set up as an array matrix. A lot of money goes there and into home improvement, readying for a winter that is likely to be unpredictable except that it’ll be a doozy either way. I tend to think the system will do what I need for technical renderings in the meantime, but I may have to put the illustrations for my novel on hold till I build a serious gfx machine. I’ll probably report back on how it does for the general work and how that 32" flat screen works out; still, feel free to suggest any other ways to improve things if any of you get a eureka moment.

I sincerely appreciate all the technical opinions. The more food for thought I have, the less I’m starving for an answer. Thanx again to all.

Dr. C.


#20

Keep in mind, though, that some apps cleverly work around the 4-gig limit by using their own hard-drive paging systems, e.g. Photoshop. With potentially only a single gig of workable space, it may be paging out all over the place.
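That kind of application-managed scratch space can be sketched in a few lines: the program memory-maps a file on disk and shuffles data through it itself, independent of the OS page file. (A hypothetical sketch; Photoshop’s actual scratch-disk machinery is of course far more elaborate.)

```python
import mmap
import os
import tempfile

SCRATCH_SIZE = 64 * 1024 * 1024  # 64 MB scratch file (hypothetical size)

# Create a disk-backed scratch file and map it into memory.
fd, path = tempfile.mkstemp(suffix=".scratch")
os.ftruncate(fd, SCRATCH_SIZE)
scratch = mmap.mmap(fd, SCRATCH_SIZE)

# The app can now "page" tiles of image data in and out itself:
tile = b"\x7f" * 4096                  # a 4 KB tile of pixel data
scratch[8192:8192 + len(tile)] = tile  # park it at an offset on disk
restored = scratch[8192:8192 + 4096]   # read it back later

assert restored == tile
scratch.close()
os.close(fd)
os.remove(path)
print("tile round-tripped through the scratch file")
```

Data parked in the file no longer has to live in the process’s scarce address space, which is exactly the trick that lets such apps juggle working sets bigger than a 32-bit OS would otherwise allow.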