c++ - When does code bloat start having a noticeable effect on performance?


I want to make big changes to the templates in one of my OpenGL projects, primarily for fun and as a learning experience. I plan to keep a careful eye on the executable size as I do this, to see how bad the bloat gets. Currently, my release build is around 580 KB when I optimize for speed and 440 KB when I optimize for size.

Yes, this is a small project, and even if my executable grew 10x it would still only be 5 MB or so, which seems small by today's standards... or is it? This brings me to my question: is the slowdown proportional to executable size, or are there thresholds and plateaus I should aim to stay below? (And if so, what are those thresholds?)

On most modern processors, locality is going to be more important than size. If you can keep the currently executing code and a good portion of its data in your L1 cache, you are going to see a big win. If you are jumping around, you will force code or data out of the cache that will be needed again shortly.
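As a minimal sketch of why locality matters more than raw size: the two functions below do identical work on the same data, but one visits memory in storage order while the other strides across it, touching a new cache line on nearly every access. The function names and the example are mine, not from the answer.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Sum a matrix stored row-major, visiting elements in storage order.
// Sequential access uses every byte of a cache line before moving on.
std::uint64_t sum_row_major(const std::vector<std::uint32_t>& m,
                            std::size_t rows, std::size_t cols) {
    std::uint64_t total = 0;
    for (std::size_t r = 0; r < rows; ++r)
        for (std::size_t c = 0; c < cols; ++c)
            total += m[r * cols + c];
    return total;
}

// Same result, but striding down columns: each access lands on a
// different cache line, so the cache is churned for identical work.
std::uint64_t sum_col_major(const std::vector<std::uint32_t>& m,
                            std::size_t rows, std::size_t cols) {
    std::uint64_t total = 0;
    for (std::size_t c = 0; c < cols; ++c)
        for (std::size_t r = 0; r < rows; ++r)
            total += m[r * cols + c];
    return total;
}
```

Both return the same sum; on matrices larger than the cache, the column-major version is typically several times slower even though the executable contains the same amount of code.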

"Data-oriented design" helps, in my experience, with both code and data locality. You may be interested in reading about it; it does a good job of showing how to structure a program so that you get good data and code locality.
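A common data-oriented-design move is replacing an array-of-structs with a struct-of-arrays, so that a hot loop streams only the fields it actually uses. The struct layout below is a hypothetical illustration, not something from the answer:

```cpp
#include <cstddef>
#include <vector>

// Array-of-structs: updating positions drags the color fields
// through the cache even though the update never touches them.
struct ParticleAoS {
    float x, y, z;
    float vx, vy, vz;
    float r, g, b, a;  // loaded alongside x/y/z, but unused below
};

// Struct-of-arrays: each field is contiguous, so a position update
// reads and writes only the bytes it actually needs.
struct ParticlesSoA {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
    std::vector<float> r, g, b, a;
};

// Integrate positions; only six of the ten arrays ever enter the cache.
void update(ParticlesSoA& p, float dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i) {
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
        p.z[i] += p.vz[i] * dt;
    }
}
```

The SoA layout also tends to vectorize better, since each loop body reads from contiguous same-typed arrays.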

(Incidentally, this is entirely a matter of cache size and locality. It is one of the reasons "optimize for size" can sometimes end up faster than "optimize for speed".)
