
interesting. so traditionally the preprocessor has been used to work around incompatibilities across various compilers/platforms and to completely enable/disable features at compile time. what's the new thinking there? can you build big projects across clang/g++/intel without preprocessor workarounds now? has the number of viable compilers in use dropped, and compatibility amongst those that live on increased? how about stuff like debugging/instrumentation? or is the current thinking on all that stuff to always build with all of it and enable/disable via runtime branch (along with some argument that configuration at build time was more trouble than it's worth on modern spacious/fast machines)?


> what's the new thinking there?

In a nutshell: Using compile-time facilities that are part of the language itself. The compiler exposes information about the platform, about itself, etc., through standard library features rather than predefined macros.

A couple of examples:

* https://en.cppreference.com/w/cpp/utility/source_location

* https://en.cppreference.com/w/cpp/types/endian

> can you build big projects across clang/g++/intel without preprocessor workarounds now?

Well, first of all, s/now/when C++20 is fully and widely adopted/.

Still, a good question. Possibly not? I'm not sure. But you would need far fewer of them than before.

> has the number of viable compilers in use dropped

Not really.

> and compatibility amongst those that live on increased?

Well, if they're compatible with the standard, that says more than it used to.

> how about stuff like debugging/instrumentation?

Maybe if constexpr (in_debug_mode)? Not sure. I'm not on the committee...



