pphaneuf: (Default)
Something that I have said a number of times is that nowadays, there is almost no reason to pick C over C++ for a new project (one of the few reasons that I know of involves writing execute-in-place code for very small embedded systems, so no, GNOME definitely doesn't qualify!). Worst case, you write exactly the same code you'd have written in C, just avoiding the new keywords as identifiers, and you then get better warnings (remember, no templates would be involved) and stricter type checking (no more silent casting of void* to pointers to random things! No more setting enums from any random integral junk you happen to have at hand! No more forgetting a header and using a function with the wrong parameters!).
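To make the stricter-checking point concrete, here's a minimal sketch (my own example, not from the slides): C-style code adjusted to compile as C++, where the explicit casts mark exactly the spots a C compiler would have let through silently.

```cpp
#include <cassert>
#include <cstdlib>

enum Color { RED, GREEN, BLUE };

// In C, `return n;` would compile silently; C++ requires the conversion
// from int to an enum type to be explicit.
Color color_from_int(int n) {
    return static_cast<Color>(n);
}

int *alloc_ints(std::size_t n) {
    // In C, malloc's void* converts to int* implicitly; C++ rejects that
    // without a cast, which also catches a forgotten #include.
    return static_cast<int *>(std::malloc(n * sizeof(int)));
}
```

Compiled as C, the casts are unnecessary noise; compiled as C++, removing them is a hard error, which is the point.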

But these slides really put it together, from someone who's generally thought of as neither insane nor dumb. Doesn't really have much to do with GCC in particular, other than just the general fact that this is becoming so obvious that even GCC might be making the switch...

Edit: This article by Amit Patel is also pretty good on this subject.

Moving On

May. 28th, 2008 10:52 am
pphaneuf: (Default)
Reg Braithwaite was writing not long ago about how we can be the biggest obstacle to our own growth. It made me realize how I've dropped things that I was once a staunch supporter of.

I was once a Borland Pascal programmer, and I believed that it was better than C or even C++. I believed that the flexibility of runtime typing would win over the static typing of C++ templates, as computers got faster. I believed that RPC was a great idea, and even worked on an RPC system that would work over dial-up connections (because that's what I had back then). I put in a lot of time working on object persistence and databases. I thought that exceptions were fundamentally bad. I believed that threads were bad, and that event-driven was the way to go.

Now, I believe in message-passing and in letting the OS kernel manage concurrency (but I don't necessarily believe in threads, it's just what I happen to need in order to get efficient message-passing inside a concurrent application that lets the kernel do its work). I wonder when that will become wrong? And what is going to become right?

I like to think I had some vision, occasionally. For example, I once worked on an email processing system for FidoNet (thanks to Tom Jennings, a beacon of awesome!), and my friends called me a nutjob when I told them that I was designing the thing so that it was possible to send messages larger than two gigabytes. What I believed was that we'd get fantastic bandwidth someday where messages this large were feasible (we did! but that was an easy call), and that you'd be able to subscribe to television shows for some small sum, where they would send them to you by email and you'd watch them at your convenience. That's never gonna happen, they said! Ha! HTTP (which I think is used in the iTunes Store) uses the very same chunked encoding that I put in my design back then...

Note that in some cases, I was partly right, but the world changed, and what was right became wrong. For example, the 32-bit variant of Borland Pascal, Delphi, is actually a pretty nice language (ask apenwarr!), and while it isn't going to beat C++ in system programming, like I believed it could, it's giving it a really hard time in Windows application programming, and that level of success despite being an almost entirely proprietary platform is quite amazing. Even Microsoft is buckling under the reality that openness is good for language platforms, trying to have as many people from the outside contributing to .NET (another thing to note: C# was mainly designed by some of the Delphi designers). Imagine what could happen if Borland came to its senses and spat out a Delphi GCC front-end (and used it in their products, making it "the real one", not some afterthought)?

I doubt that's going to happen, though. For application development, I think it's more likely that "scripting languages" like Ruby, Python and JavaScript are going to reach up and take this away from insanely annoying compiled languages like C++ (and maybe even Java).

But hey, what do I know? I once thought RPC was going to be the future!
pphaneuf: (Default)
apenwarr: No kidding. Ohh, C++ is so complicated and messy... This is so much easier... Except... Yaaaarrrrghhhhh!

People, if Perl, of all bloody languages/runtimes can do it in a less complicated way (pure reference counting with weak references, deterministic finalization), you're doomed.

Perl. Simpler. Think about that.
pphaneuf: (Angry Tongue)
Related to my previous post, I would like to use MySQL++ as a counter-example: its "result set" object does not have a "no more rows" method, it simply throws an exception when it is at the end.

See, this is a good example of something that is not exceptional at all.
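For contrast, here's a minimal sketch of the boolean-returning alternative (ResultSet and next are illustrative names, not MySQL++'s actual API): running out of rows is the normal case, so it's reported as a return value, and exceptions stay reserved for genuinely exceptional failures.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical result-set wrapper: reaching the end of iteration is
// expected control flow, so it's a boolean, not an exception.
class ResultSet {
public:
    explicit ResultSet(std::vector<int> rows)
        : rows_(std::move(rows)), pos_(0) {}

    // Returns false when no more rows remain -- no try/catch needed in
    // every loop that walks the results.
    bool next(int &row) {
        if (pos_ >= rows_.size())
            return false;
        row = rows_[pos_++];
        return true;
    }

private:
    std::vector<int> rows_;
    std::size_t pos_;
};
```

The caller's loop then reads as `while (rs.next(row)) { ... }`, with no exception machinery for the perfectly ordinary end-of-data case.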
pphaneuf: (Default)
[livejournal.com profile] wlach wrote an excellent article recently on how to use (and not use!) assertions properly, and it reminded me of some of my reflections on assertions and exceptions (warning: this is mostly written with C++ in mind, which does not have checked exceptions, no matter what you may think).

Read more... )

So, don't go forward and assert, but rather, go forward and throw!
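Without restating the whole argument behind the cut, a small sketch of the distinction (the function and its names are hypothetical): assert for internal invariants whose failure means a bug in your own code, throw for conditions that depend on input the caller controls.

```cpp
#include <cassert>
#include <stdexcept>
#include <string>

unsigned int parse_port(const std::string &s) {
    // Bad input is a runtime condition the caller can recover from:
    // throw, don't assert (an assert would also vanish in NDEBUG builds).
    if (s.empty())
        throw std::invalid_argument("empty port string");
    unsigned long port = std::stoul(s);
    if (port == 0 || port > 65535)
        throw std::out_of_range("port out of range: " + s);
    // Internal invariant: by this point, port fits in 16 bits. An assert
    // is right here, because its failure would mean a bug in this function.
    assert(port <= 65535);
    return static_cast<unsigned int>(port);
}
```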
pphaneuf: (Default)
Wow, I'm quite busy these days, haven't been writing (or reading, for that matter!) much...

Mostly, the blame lies with the ongoing quest for a place to live. I'd like to buy, this time around, so this makes it a couple of notches more complicated than what I'm used to (I've never been an owner, so this is all new to me). The numbers bandied around are making me quite dizzy! Hopefully, we should come out of this with a nice place, but in the meantime, it's time for "let's save up money like crazy for the down payment", so on top of being busy with this stuff, it'll also make me less visible than I usually am (well, uh, it should still be better than the last year!).

In other more geeky news, I think I am succumbing to the coding style of the C++ standard library with regard to naming. For method names, there are more than a few people who are going to think "finally!" (I used to favour a Java-style interCap, like "readUntil", now I tend to prefer "read_until"). This makes a lot of sense, since this is also more common in C and Perl code. But the more controversial part is that the standard library uses all lowercase for class names (it's "unordered_set", not "UnorderedSet"), and I'm getting a crush on those too... Perl, Ruby and Python are using FullyCapitalized style for those, and so are a number of C++ programmers I know, but I'm finding that there is something to be said for adopting the style of the language. I'm also using namespaces and exceptions (mostly in constructors and object-returning methods) more, these days.

So either I'm becoming stylish, or I'm becoming senile. Oh well.

Also, it would seem that the giant jackhammers are following me.
pphaneuf: (Default)
Seems like I'm now a senior something-or-other at Cypra Media, which did cause me a bit of grief. It's a "targeted marketing" company, meaning at the moment that they'll be sending out emails with ads in them to people who, weirdly enough, asked for it.

I would have liked maybe a bit more "completely new and different", and while they seem open-minded, they're not quite an actual free/open source software company, merely using a lot of it. But C++ and Perl are two of my favourites at the moment (mostly for their ratio of how much I can bend them to my will to how much they suck), and I think I might be in for learning some AJAXy JavaScript hackery in the process, which I've been meaning to do for a while, so that's that. They're Scrum fans there, which is better than being, say, RUP! Still, I'm more of a chaos model type of person, myself. We'll see, I want to try Scrum first, as it doesn't look completely nuts.

So I'll be starting there as soon as tomorrow!
pphaneuf: (Default)
I'm in a weird headspace these days. This is the first time I'm unemployed in, what, ten years? Totalling about four jobless months since I dropped out of school (which was no big loss!), and it was willingly both times, once again to move to Montreal. I'm being a bit of a homebody, which is not that different from my year in France (yes, I'd go to work, but that'd be almost all I'd go out for), but this time I'm not depressed, I'm just, you know, at home. Rather relaxing, I must admit, but better not over-extend this!

I went out photo-walking with [livejournal.com profile] jul3z last Saturday, which was quite nice. I'd had nearly zero inspiration for photos for a long time, and coming back to Montreal, I've had it coming back to me, but it seems like I was never carrying my camera at the right times (despite carrying it around a good deal, doh!). We went along the Lachine Canal toward the Old Port, and while it was a good time chatting along with her, mid-day sun and my self-imposed restriction to my 28mm f/1.8 didn't make for anything great, I feel, but it was nice pushing myself a bit, and I did spot a few places that could make very interesting photos with better light (by the way, [livejournal.com profile] jul3z, here's that chain that's gone missing!).

We then watched Stranger Than Fiction, which I had seen bits and pieces of on the flight from Casablanca to Montreal. To start off, it has a nice cast, with Will Ferrell (but it's not a Will Ferrell movie at all), Maggie Gyllenhaal, Dustin Hoffman and Emma Thompson, and they did a great job of giving some texture to the characters without being too blatant about it. For example, tiny details like Hoffman's character pouring himself some coffee at the start of a conversation, then, as it ends, pouring the content of his mug back into the coffee pot just had me imagining the lifecycle of his coffee, and what it must taste like at the end of the day. The subject is also one that I like, that of finding purpose and meaning in life, but it wasn't some overly cheesy grand meaning either, it was just the same kind of "ordinary" meaningfulness that I experience often, of appreciating what you have and being happy.

[livejournal.com profile] gregorama is right, girls in Montreal are a severe whiplash hazard! (hey, the subject says "random item", you were warned!)

The geeky type can find some dynamic language humour (Perl hackers know that there is indeed such a thing!) involving the Visigoths (that, the Perl hackers might not have known, but they probably suspected) here.

I boggle at how much attention to details Apple has sometimes, and how they manage to instil this attitude in their developer community. I had previously noticed that, for some reason, moving around word by word in the location bar of my browser using the Control and arrow keys works in a nicer way in Camino than it does in Firefox: Camino uses a slightly different set of delimiter characters, and puts the cursor in a different place depending on the direction you're going, avoiding the cases where you end up just one character off of where you want to go (say, over a dot or a slash). But I just noticed that while double-clicking on a word selects that word, if you hit the Delete key, it deletes the word and the preceding space. Selecting character by character doesn't "discover" that you stopped at word boundaries, though. Oh well, for all I know, it's going to be in Leopard.

I'm probably one of very few people to be excited by what's coming in C++0x, especially as my top peeve about C++ templates seems to be fixed. One of my big uses for template-based meta-programming was to detect errors at compile-time, but providing readable compile-time error messages is currently impossible (my error "messages" are often things like "variable YOU_FORGOT_TO_DO_THIS_THING does not exist", surrounded by a huge chunk of useless, unrelated context). Of course, actual lambdas (even though Boost has a really neat hack to do it now) and type inference are very nice to have too. The latter will certainly help cut down on the amount of "foo<bar, baz>::const_iterator it = bob.begin()" typing I'll be doing, as it will now just be "auto it = bob.begin()"! My wrists thank the C++ language committee.
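A small sketch of the two fixes side by side, using the C++0x features as proposed: static_assert for readable compile-time errors, and auto for type inference.

```cpp
#include <cassert>
#include <map>
#include <string>

// Readable compile-time errors: static_assert replaces the old trick of
// referencing a nonexistent variable with a shouting name.
static_assert(sizeof(long) >= 4, "this code assumes longs of at least 32 bits");

int count_entries(const std::map<std::string, int> &bob) {
    int n = 0;
    // Type inference: no more spelling out
    // std::map<std::string, int>::const_iterator by hand.
    for (auto it = bob.begin(); it != bob.end(); ++it)
        ++n;
    return n;
}
```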

It seems one of my ideas has been picked up by some people in Waterloo, in the form of AideRSS. Now, I want an "Edit" button in the Firefox toolbar that would use AtomPub, okay?


May. 12th, 2007 07:45 pm
pphaneuf: (Default)
Last weekend, I got my residency permit turned down, which, to make a long story short, means that we'll be heading back to Canada. Seems like I was misdirected by the Consulat de France in Montreal, and from what I hear, it seems to be something they've done a few times ([livejournal.com profile] azrhey worked in a place here where they hire a lot of foreigners, due to language skills).

So, it looks like I'm going to be looking for a job back in Montreal.

My weapons of choice are C++ and Perl, but being a Unix/Linux hacker, of course, I am not limited to those, they're just the ones I'm most deadly with. I am comfortable with meta-programming (mostly, but not limited to that of C++ templates), continuations/coroutines, closures, multithreading, as well as event-driven state machines. I am quite effective at code refactoring, particularly in strongly typed languages, where I can use the typing system to my advantage.

I am deeply intimate with Unix/Linux, mainly in the area of network programming (sockets, networking protocols, other forms of IPC). On Linux, I am quite familiar with a number of the high-performance APIs. I have a deep knowledge of the HTTP protocol (and some of its derivatives). I have experience writing Apache modules. I know the difference between bandwidth and latency (and wish more people did too). I have some experience with developing distributed software. I have a higher-than-average knowledge of ELF and Mach-O binary formats, particularly of how symbol resolution works. I know a good deal about component software (dynamically loading modules, for example), and ABI stability issues. While I am not a master at it, I have some Linux kernel development experience as well. I know what make is doing, and why.

Finally, I also have some experience doing project and release management, where I feel I did a pretty good job, and would certainly like to do more of it. I am familiar with the free and open source software community, belonging to a number of projects, including some that were part of my work.
pphaneuf: (Default)
We drank cheap sparkling wine the other day. When I say cheap, I mean 0.87 euro for a bottle. That's 1.34 CAD at the current rate.

I have also been hacking on a modern C++ implementation of property lists, and I am getting rather hooked by some aspects of Boost. The binder is astonishingly clever and asio looks very promising. I'm also told their boost::function does not use a virtual method (unlike my attempt, WvCallback), which I'll have to look into.
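From what I understand of the trick, it can be sketched like this (my own minimal version, not Boost's actual implementation; it only handles an int(int) signature and doesn't copy or own the callable): instead of a virtual method, the callable's type is erased behind a plain function pointer to a template stub that remembers the real type.

```cpp
#include <cassert>

// Virtual-free type erasure: store the callable as a void* plus a plain
// function pointer to a template stub instantiated on its real type.
class IntCallback {
public:
    template <typename F>
    IntCallback(F &f) : obj_(&f), call_(&stub<F>) {}

    int operator()(int x) const { return call_(obj_, x); }

private:
    // One instantiation of this stub exists per callable type; it casts
    // the void* back and forwards the call. No vtable involved.
    template <typename F>
    static int stub(void *obj, int x) {
        return (*static_cast<F *>(obj))(x);
    }

    void *obj_;
    int (*call_)(void *, int);
};

struct Doubler {
    int operator()(int x) const { return x * 2; }
};
```

Note that this sketch keeps only a pointer, so the callable must outlive the wrapper; a real boost::function also copies and destroys the target, which takes a couple more such stubs.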

No, there is no relation between the drinking and the hacking. :-P
pphaneuf: (Default)
Something that always bothered me while programming with WvStreams was how everything had to be a stream. Well, actually, I had demonstrated how it could be avoided, but it was a rather contrived idea where there was a free-floating object, not clearly owned by anyone, but which registered with the close callback of a stream, where it then committed "suicide". It sort of worked, but explaining it to people was a pain, and there was much opportunity to get it wrong...

Hand in hand with this, I was thinking about a next-generation event-driven API, where the lowest level would use XPLC interfaces, so that the various participants could be of different origins and not necessarily have to link with the exact same libraries. The event dispatcher would take an interface pointer to call back into, and I was contemplating whether this should be a strong or a weak reference.

In the WvStreams style, it would be a strong reference, making the fact that an object is waiting for an event extend its lifetime. This can make things rapidly confusing, since this would require an out-of-band method to tell the stream to un-register itself from the event dispatcher, which would then cause it to die. Correction, it could be a strong reference, but could be a weak one just as well, where un-registering might or might not cause it to die. That's a whole lot more maybes than what I'm comfortable with when it comes to a maintainable ownership model (I can usually make sense of what I wrote, but people coming in my code afterward have to read up a lot of code to know what's going on for sure). But having a global object that's closely involved (the event dispatcher) to deal with the ownership is handy...

Another thing that was handy was how the WvIStreamList could help with identifying streams, and help debug problems with them (largely thanks to the work of pzion!). Having as much information as possible about the objects involved is very useful in a complex server, since any number of things could be happening concurrently, and just knowing that it was a crash in the method WvStream::post_select isn't exactly useful, when it is part of just about every object in the system! So, another important attribute is observability and debuggability.

So I had an idea that I think could be an improvement on all accounts. Registering for events is a weak reference, always, but I add a separate "task" concept. Tasks provide a context for a certain processing, and they are a bit like processes in a Unix system, which do not directly have owners other than themselves (they can exit) and the kernel (they can be killed). They provide the strong ownership for the streams that the event dispatcher wouldn't assume anymore. When created, they would identify themselves better, making it possible to list them, along with parentage information. Concretely, in C++ terms, the Task class would be what people would derive from all the time, instead of deriving from WvStream to make decidedly non-stream things just in order to get the lifetime managed properly. This is similar to my contrived hack, but with observability and a uniform lifetime model thrown in, which was sorely missing.

As a side-note, I think TR1's shared_ptr, weak_ptr and friends are pretty damned sweet. I'm aware this might come off as pretty radical, but I think we should get on with the times and get rid of most raw pointers in C++ programming.
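A quick sketch of how the weak-registration idea looks with those (Dispatcher, Handler and the method names are made up for illustration, not XPLC or WvStreams API): the dispatcher keeps weak_ptrs, so being registered for events never extends a lifetime, and whoever plays the "task" role holds the shared_ptr.

```cpp
#include <cassert>
#include <memory>
#include <vector>

struct Handler {
    int events = 0;
    void on_event() { ++events; }
};

class Dispatcher {
public:
    // Registration takes only a weak reference: the dispatcher never
    // keeps a handler alive on its own.
    void subscribe(const std::shared_ptr<Handler> &h) {
        handlers_.push_back(h);
    }

    // Deliver to the handlers still alive, dropping the expired ones;
    // returns how many handlers were reached.
    int dispatch() {
        int delivered = 0;
        std::vector<std::weak_ptr<Handler>> alive;
        for (auto &w : handlers_) {
            if (auto h = w.lock()) {
                h->on_event();
                ++delivered;
                alive.push_back(w);
            }
        }
        handlers_.swap(alive);
        return delivered;
    }

private:
    std::vector<std::weak_ptr<Handler>> handlers_;
};
```

When the owning task drops its shared_ptr, the handler simply stops receiving events; no out-of-band un-register call is needed to let it die.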

I'll have to go and do some prototyping of this to see if I'm delusional again.
pphaneuf: (Default)
Went to Mulligan's yesterday with the lady [livejournal.com profile] azrhey, hoping to play their quiz. Turns out that it starts at 23:00 (otherwise known as my bloody bed time, these days), and that [livejournal.com profile] azrhey's Black Velvet was all mixed. Oh well, it was otherwise nice, but I'm left to wonder how people are expected to stay up that late on week nights. A nap after dinner and head over there sometime between 22:00 and 23:00? Hmm, maybe a plan...

While I like it, I find that I like more the atmosphere of the rue Pargaminières, around the Place St-Pierre and such. It reminds me a bit of the terrasses on St-Denis, or of Grumpy's. Which reminds me how much I miss meeting and hanging out with people. Where are my gay friends, those with otherwise odd lifestyles, the novelists, the playwrights, the artists, the hackers?

I have this weird situation where I'd like to hack very much, but when I get in front of my computer, I just flounder. I have a technical post in process on my laptop that's been going on for almost a week (it's not even that long!), and a corresponding bit of C++ that should be quite small, but seems like I just watch television instead.

At least, it's Torchwood, which I find cool. Rather different in flavour from Doctor Who, but it's got the quite hot Captain Jack Harkness as the main character, which I find very interesting. He's flexible.

I found a scheme for varying the music I'm listening to without doing so at the total exclusion of some of my favourites. I simply make a smart playlist that omits the artists listed in the top 10 of my weekly top artists. I might tweak this, but the idea is that I have too much of some bands in my list, so statistically, I end up listening to more of those, while I actually listen to individual tracks pretty evenly.

It's just striking me that when I bought an album of the Colocs, I was remembering cheerful music, but I picked up their more critically successful final album, from just before Dédé Fortin committed suicide. Not exactly it. :-P

Bleh. Entertain me.

Am I nuts?

Oct. 25th, 2006 03:56 pm
pphaneuf: (Default)
I'm rather offended, having been bitten by this bug. Why is it even allowed to take a reference to a const reference, if it could be a copy?!?

#include <assert.h>

class Blah {
public:
  Blah(const unsigned int &p) :
    myfoo(p) {}
  unsigned int getfoo() const {
    return myfoo;
  }
private:
  // This reference binds to whatever the constructor argument referred
  // to -- which may be a temporary!
  const unsigned int &myfoo;
};

int main() {
  int foo(42);
  // foo is an int, not an unsigned int, so a temporary unsigned int is
  // created to bind the constructor's const reference, and myfoo dangles
  // as soon as that temporary is destroyed at the end of this statement.
  Blah baz(foo);

  assert(foo == baz.getfoo());

  return 0;
}
Page generated Jul. 23rd, 2017 10:32 pm
Powered by Dreamwidth Studios