Hacker News

"multiple platforms" means "ten different types of gcc+POSIX+fork+pthreads"

This. It's unfortunate, but some even seem to think C99 support automatically implies POSIX, and then start complaining that a platform isn't compatible when it doesn't handle what they thought was portable code.



Why is it bad to expect a portable operating system interface? Almost all languages have been doing this over the last decade (C, C++, Java, Go, Python, Ruby, PHP, etc...).


I once had the pleasure of writing a game engine that ran on Windows, Xbox, Xbox360, PS2, PS3, GameCube and the Wii. The 3 Microsoft systems shared some very basic Win32-style system API, but the other 4 were completely unique. When I see people talk about "cross platform" code only to find they mean "Linux and OSX!" I chuckle a little.


I can only guess what the portable code for PS3 looked like :).



That was some funny stuff, and enlightening. I had the same experience with the PS2 myself when studying its architecture: so many individual processors stitched together, it was like re-living the 60s-80s computing experience, lol. That said, I thought it was cool that they threw in some scratchpad. It's better than cache in many ways and so underutilized in special-purpose, performance designs. It helps with real-time, too, as it's predictable. The PS2's mainly just helped with I/O hacks, IIRC.

Also interesting to contrast your description of the PS3 with Naughty Dog's comments. Seems like they got a lot more out of it but had to work hard to do it. Curiously, there were products that auto-synthesized code for SPUs, like RapidMind. I had planned on trying them had I used PS3s for acceleration. Did you or anyone in that industry try those tools to see if they delivered performance similar to hand-coded algorithms in gaming workloads?

http://www.theregister.co.uk/Print/2007/05/08/rapidmind_two/


The big difference between the project I was on vs Naughty Dog was that ND was 100% committed to the PS3 exclusively and eternally. If you know you can't escape then you are motivated to put in the effort to make it as good a fate as possible. The PS3 rewarded effort, but only on a very steep curve...

Meanwhile, my team was multiplatform. That meant most people could hide in the easy spaces on the PC or maybe the 360. We had a small group of Russians and Europeans who enjoyed the challenge of hand-optimizing the SPU. They would not have tolerated synthesized code ;)


That makes sense. The Russian angle, too: prior experience showed them to be good at programming and optimization. One told me it's because access to good hardware was limited for many, so people got the most out of what they had. He said some even coded and debugged algorithms on paper before going to Internet cafes.

I'd probably try to get more of them into OSS projects, but they can be a rowdy bunch. Gotta have a manager or leader who can keep the egos and nationalism in check. ;)


You mean half the code compiled with __cdecl, the other half with __stdcall, and some calls as pascal... oh, that's cute..

...and 32/64-bit..


Because C and C++ never really had one.

UNIX was seen as the C runtime. When the standard came, ANSI C adopted what were considered the minimum portable bits one could use on non-UNIX systems.

Then came POSIX, which isn't as portable as many think. It also inherits some of the UB and implementation-specific behavior of the C world.

C++, while trying to cater to C developers, adopted the same attitude.

Thankfully the C++ committee is changing that with C++17.


It's a bit disappointing that C seems to be getting left behind in some areas. It's simpler to compile, which in theory should make it more portable.


No it is not.

http://blog.llvm.org/2011/05/what-every-c-programmer-should-...

http://blog.llvm.org/2011/05/what-every-c-programmer-should-...

http://blog.llvm.org/2011/05/what-every-c-programmer-should-...

Many equate "C == what my compiler does", but just like with any standardized technology that leaves holes open for implementers to decide, portability isn't what people think.


I read Chris Lattner's posts. There are definitely things the standards don't specify that people take for granted (two's complement, for example). C is still an order of magnitude easier to parse than C++, though (especially modern C++). In theory, implementing and maintaining a compiler for it on a platform should be simpler.

Implementing an effective compiler is another concern entirely, but I would still argue that there tend to be more incompatibilities between C++ implementations than between C implementations.


Yes, parsing is way harder because C++ requires a context-aware grammar.

However, many C++ incompatibilities are actually caused by C compatibility and the goal of having higher-level constructs that are copy-paste compatible with C but with additional semantics, e.g. structs.

So in both cases, to achieve your portability goal, the languages would have to be fully specified.


C and C++ are admittedly different languages, but I would think more of the code for compiling the two should be shared than not. I personally can't see how one language would impede the other.

As a C/C++ user, I'm honestly still not sure why a number of C features have yet to be formalized in C++ (e.g. restrict).


> Thankfully the C++ committee is changing that with C++17.

Which C++17 feature are you referring to? Modules?


The work being done by the study groups (SGs):

https://isocpp.org/std/status

Filesystems, networking, concurrency, ....

Sadly, the databases effort apparently died out.


Looks like they're doing what they should've done some time ago, and what helped many competing languages get ahead. Between that and the recent standards, C++ programming might get really interesting again. Even I might take a stab at it eventually.


Even though I hardly use it at work (JVM/.NET), it was my go-to language after Turbo Pascal. So I still enjoy following and dabbling with it.

Some of the issues C++ had were:

- C compilers were still catching up with ANSI C89 and it was a mess

- C++ being in the process of being standardised was even worse. It was quite hard to find out which features worked in common across compiler vendors, especially in terms of semantics

- The C culture that prevented many nice frameworks from succeeding because they were too high level, hence why MFC is such a thin wrapper over Win32.

- Lack of a standard ABI in an age when people only shared binary libraries.

C++14 and C++17 look really nice, but I doubt C++ will recover its place at the enterprise, beyond performance critical libraries/modules.


Expecting it isn't bad; blindly assuming POSIX is available everywhere is bad, as in calling your code cross-platform just because it uses only C99 and POSIX, for instance.


The majority of the time it's a safe assumption, though, so I'm not sure developers can really be blamed.

The reality is, without it, developing software for a platform takes more effort which I think is one of the reasons languages have taken on the initiative of building portable interfaces. Even C11 has added threads.h.



