Monday, 15 October 2018

C++: To Reference or Not

Constant Something Reference or Something Constant Reference... I ask myself this in a similar manner to Prince Hamlet at the start of the nunnery scene... As a C++ programmer and English speaker I've always found myself most comfortable using the phraseology thus:

const int X(0);

And to pass this to a function:

void foo (const int& p_X);

I find this most useful: we're passing a constant integer reference to the function... However, I was recently challenged that this meant something else to a reader. Her input was that it meant a "constant integer" reference, that is, we would ONLY be able to pass "const int" instances, not "int" instances. The thinking being:

const int X(0);
void foo(const int& p_X);


Would compile, whilst:

int Y(42);
void bar (const int& p_Y);

bar (Y);

Would fail to compile, or at least spit out a warning, because an "int" could not be passed into the function: to this reader, "constant integer" and "integer" were different types.

They behave almost interchangeably here, of course; const is a qualifier (and one which we can remove with const_cast), and a reference-to-const will bind to a non-const object. Declaring the parameter as a "const int reference" is not about restricting which instances can be passed in; the purpose is to communicate what we do with the value going into the function.

It is 100% not about stopping Y being passable into the function as above.

No, we want to tell any user of our code that within the functions "foo" and "bar" we do not change the value: if they pass us Y containing 42, then when the function is complete Y will still contain 42.  If the value could potentially change we would not be able to use "const"; that's the purpose of passing const in this case.

Passing by reference simply saves us the time, effort and delay of allocating memory and taking a copy of the parameter before moving into the body of the function, so:

void bar (const int p_Y);

Would make the same promise as above: we tell the user we don't change their value. This time, however, a copy of the value being passed in is taken, and the function operates upon that copy; the const now only stops us mutating our own local copy.

The communication we get with this is very useful.

But of course, if we're using threaded programming and we pass a reference to a value at time point A, then sometime later edit the referenced value elsewhere, we may run into unexpected behaviour. So there is sometimes an argument for taking a copy at the point of starting the child function; for the most part, however, passing by reference is considered the norm.

Thursday, 11 October 2018

When to Dark and When to Light

As a programmer I spend a lot of my time looking at a screen, using a keyboard, or both.  As such, over the years I've gotten quite particular about both: I like a nice crisp screen (or more than one, to be honest) and I like my keyboard to be mechanical and one I enjoy working with. Rubber-domed plastic crap from hell... For eight to ten hours a day, no thank you.

The screens are easier: today it's simple to get a nice one, so I focus more on what's on them, which will generally be my IDE or text editor of choice on whichever Operating System I happen to be on at the time.

And I've come to loathe the themes available in some IDEs, but only in very specific circumstances...

Let's take where I spend the majority of my time, Microsoft's Visual Studio. It comes with a dark and a light theme, and I like both... I LIKE BOTH... But I like them in different circumstances.  The Dark theme I like when I'm in my own dark room, late at night, when I'm not interacting with others.  Because (and maybe you get this as well) if I look at bright text on a dark background the text sears a blot into my retina. Not a problem if you're in a dark room, but in a bright room, or an office where you then need to go and interact with others, you're left with this blur in your vision... So I keep dark to my own space at my own time. I know when to Dark.

Light themes I use in the office. I find they don't alienate anyone (programmer and non-programmer alike can get into dark text on a light background), and they stop a vibrant and colourful display from distracting my ever-changing focus in an open-plan office too badly. I know when to Light.

The problem?  Well, it seems many people don't. I recently met a programmer who steadfastly would not look at my work on my machine as it was "too bright"... Like he was allergic to LED light or something.  Literally telling me he couldn't read the code... Not because of formatting, not because of a lack of understanding or bad coding standards, but because I was in a light theme.

Folks, stop now and learn when to Dark and when to Light.  Please.

Welcome Canadian Viewers

First time Canada has topped my stats on daily views... Welcome Canadians!

Monday, 8 October 2018

Good Riddance Google+

When I was forced (through my YouTube account) to create an interlinked Google+ account I was not best pleased.  And I posted as such on the pages there, then never returned to them.  As a social media platform it was flawed and ran into the "Google being evil" motif far too strongly.

Anyway, I was looking through BBC News just now and found this...

The fact Google knew about the issue in March and didn't disclose it... Well it's pretty much just the dregs.

Wednesday, 3 October 2018

I don't really go in for political commentary; unfortunately, what's caught my eye is a bunch of folks at Uni who are members of the Conservative club... Whatever, there you go... That's not the problem, nor is the fact that they've seemingly gone in for body art...

No, the point I want to make is... There's one female in the picture... She's allowed them to, or they've just, drawn messages on her bosom... She seems nonplussed about this...

But look over her shoulder... There's a chap in glasses... Just staring at the back of her head. It needs captioning... "Hmm, you smell nice"... "Mine, my precious"... It really does... I'm more worried about that than the seeming racism in the shot.

And it's an utter disgrace, the chap on the right in the glasses making the "ok" sign... He's got a Windows symbol on his shirt... Where's the Penguin?  Bloody Racist!

Monday, 1 October 2018

My Social Media Stance

I am not on LinkedIn, there I said it... This seems to be something of a mystery to folks, why are you not on LinkedIn?  You should be on LinkedIn!  I'll send you a link to link up in LinkedIn...

The problem is... I don't want to be on LinkedIn.  I don't particularly engage with social media; I understand many, many people do, I myself personally do not. Facebook is extremely minimal for me; Google+ has a message from me saying I won't be using it, just because my YouTube account got forced to create one (and they did that automatically); and I only use Twitter to complain to companies about poor service!

Perhaps this is the problem, putting oneself on social media makes you immediately accessible, searchable, findable, but not always in the form nor forum you desire, being pulled from pillar to post.

I've been reading a book by Neem James, essentially about how distracting life is, and as a developer I find the number of distractions I can recount moment on moment amazing... Right now, in an office for programming, I hear conversations, phone calls, machines whirring, doors banging, someone's mobile ringing... And that's before the things I need to listen out for, like my name being called or members of the team around me needing input on the current project.  This isn't to say I want library-like silence in the workspace, but it's a lot of chatter. Subdivision of an area, hard walls or baffles to stop very noisy equipment disturbing all... they're simple ways to subdivide real life.

I find all this division of attention can slice into the quality of the work produced.

Social media doesn't yet seem to contain any solution for the inane chatter you can experience, especially if you have a lot of contacts, spreading ever further as the rule of six degrees is ever eroded by the web we weave.

Without any way to baffle the sound, stop the noise on Social Media, I've opted not to engage directly with it, save for where the form of the communication reflects me in some way.

I'm not a tweet kind of guy, I'm too verbose and the rich tapestry of life can't be reflected in so few words (plus I hate when tweets start 1/3 - gah).

Then there's the content of an open social media account: it's a soup of opinion, some good, some bad. Both contributors and readers can choose what they take away from it, or indeed interpret it in different ways.

I suppose the same is true of this very blog, but at least here I know what I intend, and if it's misconstrued I only have myself to blame. A Facebook post, however, could be taken out of context, appear on another timeline, or be moved not by a human (whom I find quite understandable) but by an algorithm: by something ethereal, something fitting to the best metrics, rather than knowing.

It's a rabbit hole I'm not willing to be driven into; if I choose to pop down, so be it, but I won't be forced, and I'm yet to see a killer need for it.

I'm a strong believer in direct contact, be that phone, email, text or face to face.  A limited tweet, not so much (I can be very verbose).

Thursday, 27 September 2018

Return to Uni

Tonight, I return to university.  No, I'm not starting any course; I'm going to the first meet and greet as an Alumni Fellow, to which I've signed myself up and been accepted. I don't think this lets me add any letters after my name, but it's nice to drop into conversation that you're a Fellow of your old university.

This will be the first time I've returned to the Newton Building in Nottingham city centre since its massive renovations over the last decade, and I'm excited to see what they've done with the place, whilst also feeling a little trepidation as to what they've done with the place.

For university seems to be a very different business to the one I left when I graduated.  I was in the very last year where public entrants to university had their fees paid; I left with a degree and zero debt, which was practically unheard of then and is even more so now.

I therefore look back with memories of the technology, the modules and the learning pathway I was upon and I see where it led me, but also what was worthwhile and what not.

Worthwhile were my programming techniques, some of the data modules and parts of the advanced technology subjects, like Artificial Intelligence.  Things not of use: the business models, the presentation or self-worth items, and the database modules.

The latter were pretty useless as they taught nothing about scaling, nothing about what was then the emerging field of cloud computing; you were one person on one machine with one very old copy of DBase or FoxPro or, evil of evils, Microsoft Access.  None of it carried over into the placement I was on whilst still on my course, nor into modern use of data models and techniques.

The business models were not much use because, simply, the dot-com bubble had just burst; the bottom had fallen out of the high-priced computing model, and so as I hit the workforce at the turn of the millennium I was one of many voices crying out for the same jobs, which could be filled either by a salary-dictating graduate or by Jimmy Bloggs, who'd just read about it in a "Learn Computers in 24 Hours" book but cost the company a quarter of the price.

Therein lay the first challenge: the cost of being a graduate. Companies were seeing computers and computing power as an ever cheaper resource; Moore's Law was still in effect at the time, and the cost of the machines you were driving was ever decreasing, yet you were demanding the same or higher salary.  It made no sense to business middle or low-runners.  So the highly skilled had to find highly niche or skilled jobs rather than getting in on the ground floor of IT and computing businesses.

Fast-forward into those roles, however, and the price-to-reward difference of a graduate's skills became obvious to the employer: they had initiative, skills to tackle anything thrown at them, and the confidence in their own skills to exercise and even expand on the problems and tasks set them.  Whereas I observed first hand lesser experienced hands go down with the proverbial suicide project*.

Graduating with no debt, however, let me leverage lesser paid roles to drive myself out into a wide web of businesses: from underwear manufacturing, time and attendance systems, web-enabled pressure monitoring systems and remote SMS data collection, to entertainment system engineering and, most recently, international development of pay-to-play entertainment products (fruit machines).

I think today, nearly twenty years on from graduation, I have a larger experience base in the development field than most peers I meet around me, and critically, as I've always diversified my thinking and skills, I've kept myself interested in technology (this blog is evidence of some of the crack-pot things I get on with).

So tonight, I return to university to meet and mix with peer Alumni; let's see where this goes.

* This is my phrase of the month, by the way; in a few months' time I may be able to explain why.