scons script (from "Aberations in graphics" thread)
02-27-2007, 11:30 AM,
thelusiv Wrote:I checked in something (r1561) that sort of fixes this. Actually it's more of a trade-off. I set up an automatically generated header file, "definitions.h", which now contains all the #define statements that used to be -D flags for the compiler. Now fewer files have to be recompiled every time something in this file changes. It's still a lot...but it's not quite all of them. Let me know if you guys like this better than recompiling all the files every time a define changes.
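The generated-header approach described above can be sketched in plain Python (hypothetical helper names, not the actual project code). The important detail is to rewrite definitions.h only when its contents actually change, so its timestamp stays stable and dependents are not rebuilt on every run:

```python
# Sketch: regenerate definitions.h from a dict of defines, but only
# rewrite the file when the rendered text differs from what is on disk,
# so the mtime (and thus dependent recompiles) stays stable.
import os

def render_definitions(defines):
    """Render {'NAME': '"value"', 'FLAG': None} into C header text."""
    lines = ["#ifndef DEFINITIONS_H", "#define DEFINITIONS_H", ""]
    for name, value in sorted(defines.items()):
        if value is None:
            lines.append("#define %s" % name)
        else:
            lines.append("#define %s %s" % (name, value))
    lines += ["", "#endif", ""]
    return "\n".join(lines)

def write_if_changed(path, text):
    """Write text to path only if it differs; return True if written."""
    if os.path.exists(path):
        with open(path) as f:
            if f.read() == text:
                return False
    with open(path, "w") as f:
        f.write(text)
    return True
```

Any file that includes definitions.h still recompiles when a define it uses changes, but a no-op regeneration no longer touches the header.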

I still find there is too much recompilation because of REVISION, so I did some searching in the SCons mailing list archive. Here is what I found:
Quote:And here's another, which may be more in line with what you were asking:

env = Environment(CPPVAR = '-flag2', CPPFLAGS = '-flag1 $CPPVAR -flag3'.split())
p = env.Program('foo',
    env.Object('foo2.c', CPPVAR = '-flag4'))

foo2.c will be compiled with -flag4 instead of -flag2.
We can adopt the same method to add -DREVISION="xxxx" only when compiling main.cpp. This should avoid the excessive recompilation caused by revision changes.
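A minimal SConstruct fragment for this idea might look like the following. This is a sketch, not the project's actual build script: get_revision() and the file names are hypothetical, and the per-call CPPDEFINES override mirrors the mailing-list example quoted above.

```python
# SConstruct fragment (sketch): pass -DREVISION only to main.cpp so that
# a revision bump recompiles one object file instead of the whole tree.
env = Environment()

# get_revision() is a hypothetical helper returning e.g. "1561"
main_obj = env.Object('src/main.cpp',
                      CPPDEFINES = env.get('CPPDEFINES', [])
                                   + [('REVISION', '\\"%s\\"' % get_revision())])

# all other sources are built without the REVISION define
other_objs = [env.Object(f) for f in other_sources]
env.Program('game', [main_obj] + other_objs)
```

Because only main.cpp sees the REVISION define, changing the revision string invalidates a single object file.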
02-28-2007, 02:00 AM,
We could do this if we restructured our source-building method a little. Right now we just gather a big list of files and compile them all with the same set of options. I think the way you describe is ideal...but I wonder how easy it will be to keep up with all the individual defines that need to be set. Besides REVISION there are many other defines we could optionally add to only a few files, so that we don't have to recompile the whole project when we enable something like force feedback, binreloc, or other such features. Does that make sense?
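One way to keep up with the individual defines, sketched here with hypothetical file names, is a single map from source file to its extra defines; the build loop then looks up each file instead of hard-coding special cases:

```python
# Sketch: one central map of per-file extra defines, so enabling a
# feature only recompiles the files that actually use that define.
BASE_DEFINES = ['UNIX']

# hypothetical file names; r1561 is the revision mentioned in this thread
EXTRA_DEFINES = {
    'main.cpp':          [('REVISION', '"1561"')],
    'forcefeedback.cpp': ['ENABLE_FORCE_FEEDBACK'],
    'pathmanager.cpp':   ['ENABLE_BINRELOC'],
}

def defines_for(source):
    """Return the base defines plus any file-specific extras."""
    return BASE_DEFINES + EXTRA_DEFINES.get(source, [])
```

In the SConscript, each file would then be compiled with something like env.Object(f, CPPDEFINES=defines_for(f)), and adding a new per-file define means editing one dict entry rather than restructuring the build.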
