Oddity in 2.5


#1

Or at least it seems to be limited to 2.5; I reloaded the model in 2.49 and it didn’t have the same effect. As you can see below, it gives the impression that my wires are too close to each other and have broken through, but they are not. The effect worsens as I zoom out. Is this a feature? If so, how do I turn it off? I find it distracting.

Or could it be that I just have my underlying mesh too close to the outside layer?


#2

Are you sure that’s not supposed to be indicating the direction of the Normal vector?

Is there a vertex on the end of that thing?


#3

I think this happens in many Blender versions, but it depends on a setting in the View Properties panel (View menu/View Properties…). In the panel that appears, under the View Camera: settings, I believe it is related to the Clip Start: parameter.

As you may know, the Clip Start: parameter determines what the view includes based on distance to the camera. If you set it to a value that is too high, you cannot zoom in close to objects to work in detail, because anything closer than that distance disappears (becomes invisible).

To view things that are closer to the camera you have to give that parameter a much lower value. This works, but it has a downside: the lower you set the value, the worse the problem you mention becomes, and at the lowest value you see it fully.
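To make that intuition concrete, here is a minimal sketch (plain Python, nothing Blender-specific; the function name and numbers are just illustrative) of how the resolution of a typical 24-bit depth buffer degrades as the near clip distance, i.e. Clip Start:, shrinks:

```python
# Rough sketch (not Blender code): how a 24-bit depth buffer's resolution
# depends on the near clip distance ("Clip Start"). Assumes a standard
# perspective projection; numbers are illustrative only.

def depth_resolution(z, near, far, bits=24):
    """Smallest separation (in scene units) that two surfaces at distance z
    can have before they map to the same depth-buffer value."""
    steps = 2 ** bits                            # number of distinct depth values
    # window depth d(z) = far/(far-near) * (1 - near/z); invert its derivative
    dd_dz = (far / (far - near)) * near / (z ** 2)
    return (1.0 / steps) / dd_dz

far = 500.0       # a typical "Clip End"
z = 50.0          # surface roughly 50 units from the viewpoint
for near in (1.0, 0.1, 0.001):                   # three "Clip Start" settings
    print(f"Clip Start {near:>6}: ~{depth_resolution(z, near, far):.4f} units")
# The printed resolution gets coarser as Clip Start shrinks, so nearby faces
# start to "fight" and wires appear to poke through each other.
```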

It could be that in the new Blender you have that parameter set to a lower value than in the older Blender, and that is why you see this in the new Blender only. As far as I know, this happens while you are in Edit mode.

I’m not sure why Blender does this. I found it annoying too and thought it was a bug, but I don’t know if that is true. I don’t know if there is any practical purpose to it either; I wish that if it is a bug it would be fixed, and if it is not, that there were a button to toggle it off, because I don’t like it. I just have to live with it, since I also work very close to the camera for detail.

The effect looks very similar to what you get when you use a Subsurf modifier on an object and adjust the Median Crease W: parameter in the Transform Properties dialog (Mesh menu/Transform Properties…), but I don’t think they are related. I just wish somebody would fix this. Anyway, if it is a bug it should be submitted as such.


#4

Generally Pixel is right, but it has less to do with Blender and more to do with the way the graphics card generates the image.


#5

It’s related to your video card. You need to plan on using a smaller view range (in View Properties), keeping the end limit as small as possible.

Usually this kind of artifact is limited to gaming cards. Workstation video cards (like the Quadro or the FireGL) don’t show this kind of limitation as far as I’ve seen, but things can always change as drivers change…

But you’re right, this effect is much less noticeable in the 2.4x series, although it is definitely present there too (you need a larger limit in the view settings to make it noticeable). Maybe if you file a bug report the devs can take a look at it.


#6

Are you saying that this depends on the graphics card the PC has? Are you saying that it is a general problem with many graphics cards?

Well, I did a test with the 3 very different computers I have at home and these are the results I got:

The oldest one is my sister’s computer, which has Windows XP Pro and an Athlon CPU with a Voodoo3 graphics card (it had an ATI 7500, but it got damaged and I replaced it with the older Voodoo3). On this PC the problem doesn’t happen at all, no matter how much I adjust the Clip Start: parameter.

The second computer is my old PC, which has Windows XP Home and a Pentium 4 3 GHz HT with a motherboard that has an integrated ATI 9100 graphics chip. This PC shows the problem, and it has the latest drivers that were available for this chip.

The third computer is my newest PC, which has Windows Vista Ultimate 64-bit and an i7 with two Evga 260 GTX graphics cards. It is a much newer and more capable PC, and the problem also happens on it, with the latest drivers that were released recently.

So is this a common problem? A common problem with graphics cards, or a common problem with Blender? I could say that it is a problem with graphics cards, but I have used other 3D software like 3D Studio Max (a few older versions) and GMax, I have used AC3D for quite a while now, and also K3D, and I haven’t seen those programs show this issue. I have also tried demos of other 3D programs like TurboCAD and Rhinoceros 3D and I have not seen the issue there either, so I suspect this is a Blender problem more than anything.

Don’t get me wrong, I like Blender a lot and I use it regularly now. This is just a minor inconvenience, but it is an inconvenience, because it makes the graphics look inaccurate on screen. At least it only happens while editing and not when rendering, which is when it counts the most, but I believe this should be fixed.

Has anybody else seen other 3D programs show this exact issue? I would like to know, and maybe other readers of this post would like to know too. So if anybody has had this problem with other 3D software, it would be good if they added their input here, to see whether this is a Blender problem or something else, because my own little personal test or survey is not enough to tell.

Another data point I can give you is that I have played a lot of 3D games on PC, including some newer ones like Crysis, Cryostasis, FlightGear, Vehicle Simulator, Neverball and Neverputt, Quake 3 Arena, Batman: Arkham Asylum, Unreal Tournament III and many, many others, and I have never seen this issue in any of them. Several of these games use Direct3D but others use OpenGL, so I cannot say that this is an issue with either of those.

Now, game engines are very different from 3D editor engines, but they do share a lot in common in the generation of real-time 3D graphics, so I would like to know why I have seen this only in Blender. Because of that, it would be interesting to hear from other users whether they have had the same issue with other 3D programs.


#7

In my case I can set the Clip Start: parameter to about 0.10 and it disappears completely, but sometimes (quite a few times) I have found myself needing to zoom in closer to work on detail and have had to set it lower, and boink, there it was. So I just leave it set low so I don’t have to change it all the time, and I have to live with it. That’s too bad; I wish it didn’t happen at all. Like the other guy said, it’s kind of annoying.


#8

As I said, it is not Blender’s fault. It doesn’t matter whether it’s OpenGL or DirectX; it works the same. But games are not very precise, and their authors make sure such situations don’t arise. I have never seen a computer that didn’t have these glitches in the viewport, although I suppose it might be driver-dependent to some extent. The more important thing is the actual rendering, where a similar situation can sometimes occur, depending on the camera options.

I have seen this in many 3D programs, but it may be less frequent if a given program sets the clip limits dynamically depending on the scene geometry.
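If it helps picture that last idea, here is a minimal sketch of "dynamic clip limits" (my own hypothetical helper, `choose_clip_range`, not code from any actual program): pick Clip Start/End from the scene's bounding sphere so the near/far ratio stays small and depth precision stays high.

```python
# Hypothetical sketch of choosing clip limits from scene geometry.
import math

def choose_clip_range(camera_pos, scene_center, scene_radius, min_near=1e-3):
    """Return (near, far) that tightly enclose the visible scene."""
    dist = math.dist(camera_pos, scene_center)
    near = max(min_near, dist - scene_radius)   # don't clip into the scene
    far = dist + scene_radius                   # just past the far side
    return near, far

# Example: camera 20 units from a scene that fits in a radius-5 sphere.
print(choose_clip_range((0.0, 0.0, 20.0), (0.0, 0.0, 0.0), 5.0))  # (15.0, 25.0)
```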


#9

Are you saying that this depends on the graphics card the PC has? Are you saying that it is a general problem with many graphics cards?

It’s a problem with cards since at least the GeForce 3 and all the Radeon series. This kind of issue is also noticeable in AutoCAD when you have a certain number of 3D elements and a certain zoom factor, as well as in Solid Edge and, in some cases, SketchUp. In all of these it is expected, since you are using gaming cards that are optimized for games, not for 3D applications. The problem itself is due to the limited ability of gaming cards to deal with floating-point numbers.
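As a toy illustration of that floating-point point (my own sketch, not anything from a GPU or driver; it just uses NumPy’s 32-bit floats to stand in for whatever limited precision the card works with):

```python
# Toy illustration only: 32-bit floats stand in for the limited precision
# described above. Two depths that differ by a tiny amount become identical.
import numpy as np

a = np.float32(100.0)                       # one surface at depth 100.0
b = np.float32(100.0) + np.float32(1e-6)    # another surface 0.000001 behind it
print(a == b)   # True: the difference is below float32 resolution at this
                # magnitude, so the two surfaces can no longer be told apart
```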

In games, the engine usually applies many workarounds depending on the card you use (and sometimes the drivers take care of workarounds in cooperation with the game’s producers). That’s why, for example, nowadays you can’t play the PC version of Metal Gear Solid 2 on any Nvidia card, since it was designed with ATI cards in mind (even though the game uses DirectX). Or a more recent example: VLC 1.0 can’t use hardware decoding acceleration from ATI cards right now (but other software can use it without problems), while version 1.0.1 can, but only if you install the upcoming Catalyst 10.7 driver.

These kinds of problems are common nowadays with gaming cards.

But we are talking about Blender, right? Well, as I told you before, the problem exists in 2.49 also, but it was less obvious. My advice is just to file a bug report at the Blender Tracker at:

http://projects.blender.org/tracker/?atid=498&group_id=9&func=browse

And explain the problem in detail. If they can do something about it, they will. (You will need to register on the site to post reports.)

And about the Voodoo card… those cards were really well designed. It’s a sad thing that they couldn’t survive.


#10

No, I wasn’t that sure it was a problem with Blender. I thought it was a possibility, and that is precisely why I said I wanted input from other people using other 3D software, and why I said my own personal survey was not enough to tell.

That is also why I mentioned that in my experience with game engines you don’t see this, but I also mentioned that game engines are different from 3D editor engines (such as 3D Studio Max or Lightwave), because I know they are optimized in a different way. I mentioned that as a precaution, since I suspected that could be the reason why you don’t see this problem in them. From your input that is obviously the case, so you confirmed my fears.
 
I have used other 3D software, but not to the extent I have used Blender. I have certainly seen some viewport clipping, but I didn’t remember this lines-over-the-corners issue. It is possible that back then, when I had less 3D experience, I didn’t work at the level of editing or detail that I can now, and because of that I never had to adjust the viewport or camera settings to a degree that would show this exact issue as I see it now.
 
That was precisely my problem: I could say it was a problem with my graphics card, but I have only observed this particular thing in Blender, so again I suspected Blender was at fault. Somewhere in the middle of my post, though, I thought I’d better ask about other people’s experience with this, because I had my doubts.
 
What stargeizer mentions about this happening more with consumer-level cards than with workstation cards does get my attention. I know that gaming cards have more limitations compared to Quadro-class graphics cards; it is not the first time I have heard of this.
 
I also heard that workstation-class cards allow things like anti-aliasing while editing in the viewports, even though I have been able to do that too with some other 3D programs on my newer graphics cards (I know it cannot be done in Blender 2.4x, but I heard that in 2.5x it can; I just haven’t tested this in 2.5 yet). Anyway, I have heard of some other issues, and that is too bad, because those cards are in essence very similar to gaming cards and many use the same chips, but at the software level and in a few other ways they are given more capabilities than gaming cards, which is rather sad.
 
Well, that is the way the market is set up anyway. I know this could be filed as a bug report, but from what you tell me it seems it is not going to be solved like that. I don’t know, do you still think it is worth it? Because if this were a problem with Blender it could be solved more easily, but if it is a graphics card issue I fear we are going to have it for a long, long time. :shrug:
 
Another interesting thing is his mention of how well built those Voodoo3 cards were (maybe he is very right about that, since that one is still working). It is indeed a pity that some competitors in the graphics card area are no longer around; I think a bit more competition would have benefited the consumer, but that is the way things are now. I laughed when I saw such an old graphics card not do this while my two much more powerful 260 GTXs have a problem with it; it’s almost unbelievable, ha, ha, ha. Don’t get me wrong, I’m very happy with the overall performance of these two graphics cards; they have tremendous power and overall they work fine. But maybe if 3dfx were still in the competition, just maybe, we could have the same kind of power with fewer problems. Unfortunately they lost the cola wars… er… I mean the graphics card wars.
 
Kinda silly indeed that the problem didn’t show on that card. Amazing, ha, ha, ha. :D

#11

Blender 2.49 and older work great with viewport antialiasing (unlike XSI). I use the 8x setting and there is just a few percent speed drop.


#12

I have tried it and it has given me problems on many different computers, not just mine. I have also heard about the problem from many other people. If I try to force it through the graphics card control panel it works, but the interface starts to give me problems and behaves in a weird way.

Is there a trick to get it to work right? Do you need a high-end Quadro-class graphics card or something like that, or is it something else?


#13

I also had problems with AA on an Nvidia card when setting some exotic AA algorithm. I’m currently using a Radeon on Linux and it works perfectly if the filter is set to box. Blender cannot switch on AA on its own, so it has to be forced through the driver. The only problem is that I can’t run two instances of Blender at the same time on one display, as it causes artifacts in the interface.


#14

I already knew about bypassing the application settings. I know it’s the only way to give anti-aliasing to Blender, and I also use it for many games and applications; it works great, because some older games and applications don’t have anti-aliasing settings.

Also, some games and applications have a limited quality setting for anti-aliasing, and I guess they assume that most people can’t use much more anti-aliasing because their computers can’t handle it.

My older computer’s graphics chip can’t handle it, but my new PC is an i7 with triple-channel DDR3 memory and two graphics cards, and I can take anti-aliasing all the way to 16xQ in most instances; most anything 3D that I have runs as if it were nothing. So it is pretty cool to be able to bypass application settings in the anti-aliasing department. Some old games, like Quake 3 for example, which I still like a lot, run in full HD (with custom settings) with 16xQ anti-aliasing as if anti-aliasing were turned off! All the edges look smoooooooth. These modern PCs are terrific. :slight_smile:

As for the box setting, I have two Evga 260 GTX cards in SLI mode and I couldn’t find a setting in the Nvidia control panel that allows me to change the anti-aliasing filter type to box. What I see is the setting for the amount of anti-aliasing. I also have anti-aliasing transparency, which allows me to choose between multi-sampling, super-sampling, or off.

There are other settings, but none that changes the type of anti-aliasing algorithm that I know of, so that’s too bad.


#15

I can make it work with the Blender 2.53 beta, but I get a weird problem with selection. If I force the anti-aliasing through the Nvidia panel, it works, and it works fairly fast on my rig, but the selection of vertices, segments and faces gets all screwed up and I get the wrong ones when I select; if I try to select a segment, for example, another segment gets selected instead, seemingly at random.

It doesn’t matter where the vertex, segment or face is; it could be on the other side of the mesh, right beside the one I’m intending to select, or nearby. It just selects things randomly; the selection simply goes all crazy in Edit mode.

Does anybody have any idea whether this is an Nvidia bug, a Blender 2.53 bug, or something else?


#16

I had those problems on Nvidia a few years ago. It is also common in other 3D apps and depends on the way your card generates the antialiasing. There were reports about it working in Maya only with select versions of the driver, and the cause was probably the way vertex arrays were implemented.
But it was a long time ago, so I can’t help much. I just suggest looking at the Blender settings regarding viewport acceleration and maybe at the driver settings (no strange filters for AA, only straightforward 2x, 4x, etc.). But it may be impossible or impractical to work this out; I changed to ATI for a reason.


#17

Just curious, why the need for AA in the viewport? Besides “it looks nicer”.


#18

Because AA and textured view get you as close to WYSIWYG in 3D as possible without rendering. Have you ever tried to understand the wireframe of a complicated mesh without AA?


#19

Working with anti-aliasing in the viewports gives you a closer idea of how the final model will look, and anything that goes in that direction is an improvement and a step in the right direction.

Some higher-end graphics cards, such as the Quadro line, are being designed today to work this way, and many people using CAD are already starting to work like this.

What happened is that in the past the performance of computers and their graphics cards wasn’t enough to do this: as soon as you started to draw objects or models with too many polygons, everything slowed down to a crawl, so it was better to leave it off. But as graphics card performance has continued to increase over the years, it has reached a point where it is starting to become feasible to use anti-aliasing in real time in the viewports too, and not just in the rendering.

Of course, one may still create a scene complex enough to slow things down anyway, depending on what kind of PC and graphics card setup you have, so ideally software should allow you to turn viewport anti-aliasing on and off easily. That way you could switch it off if the scene you were working on got too complex, and have either quality or speed as needed.

I think modern 3D software should start to provide more support for this, as well as an easy way to turn it on and off with a keystroke or key combination.
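For what it’s worth, here is a rough sketch (not Blender’s code; just a generic OpenGL example using the glfw and PyOpenGL Python packages, chosen only for illustration) of how an application can request a multisampled viewport itself instead of relying on a driver override:

```python
# Not Blender's code: a sketch of an application asking for viewport
# antialiasing itself (a multisampled framebuffer) rather than relying on a
# forced driver override. Requires the 'glfw' and 'PyOpenGL' packages.
import glfw
from OpenGL.GL import glEnable, GL_MULTISAMPLE

glfw.init()
glfw.window_hint(glfw.SAMPLES, 4)            # request a 4x MSAA framebuffer
window = glfw.create_window(800, 600, "AA viewport sketch", None, None)
glfw.make_context_current(window)
glEnable(GL_MULTISAMPLE)                     # enable multisampling for drawing
# ... draw the scene; edges in this viewport come out antialiased ...
glfw.terminate()
```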

In my case, with a PC that has an Intel i7 920 running at 2.66 GHz, 1300 MHz DDR3 memory and two 260 GTX graphics cards in SLI mode, I have tested Blender with fairly complex scenes and it still manages a fast and smooth frame rate in the viewports. Right now there are several graphics card setups out there that are already way faster than that, so I think the current performance level of higher-end PCs is already capable of handling this.

In my case the problem is that if I force anti-aliasing through the graphics card control panel, selection gets all screwed up in Edit mode in Blender. If it didn’t do that, I would already be using it with the Blender 2.53 beta, because my newer PC can handle it with ease, and many other people’s PCs out there can handle it already too.

So I think software manufacturers should start to pay attention to this and support it better in their programs, because it is a trend that is gaining momentum; it has begun to be used by a lot of people, and many users are already demanding this capability.

I believe that in the near future this is going to become the de facto standard way of editing 3D in the viewports, just as happened with anti-aliasing in 2D vector illustration programs like Xara, Illustrator, CorelDraw and others many years ago, once computers had enough performance to handle it.

It just took longer for this to reach 3D modeling, because in 3D modeling the performance demand is much bigger. In 2D vector illustration it is being accelerated even further now with things like Direct2D and OpenVG.

I believe that once people get used to working with anti-aliasing in the viewports, they won’t want to go back to doing it the other way, just as happened back then with illustration programs. If you ask many illustrators today to use software that doesn’t support real-time anti-aliasing while they draw, they will say: Are you kidding me? And I’m one of them.

I believe the time has come for a similar transition in 3D modeling and 3D CAD software.


#20

These sentences each represent the principle of antialiasing: multi-sampling the same concept numerous times in order to achieve a slower but more detailed realization of the overall intended idea.