Thursday, 30 November 2017

C++ : Trust the STL

One sore lesson to teach some developers is when to trust the compiler; once you've gotten that across, you have to start teaching folks to stop re-inventing the wheel.

If someone has already implemented a file handler, or a serial port abstraction, or a wrapper for some obscure feature, you need to evaluate that offering...

To evaluate whether a library is worth using, first see if it works, then see how many folks actually use it: the more users it has, the more likely bugs have been flushed out and the whole thing has been well tested.

Leveraging this kind of mature code within your projects assists in bootstrapping the startup phase of new projects.

Boost is a noteworthy example of what I'm talking about here. Many software shops (at least the ones I know) resist using open-source or third-party libraries; they prefer to stick to in-house developed niche implementations until the very last moment, which of course slows development and completely stymies innovation.

Boost however is one step further than the problem I'm going to tackle today... The Standard Template Library...

The STL is often commented upon negatively, despite it being a hugely available resource, vastly and deeply tested throughout and constantly incorporating new innovations.  Whole books have been written on the topic, and yet one can still find projects and individuals resisting use of the STL.

STL nay-sayers will quote "no need for an STL requirement", "uses less memory than an STL implementation" or "faster than the STL"...

The problem with this attitude is: are such developers going to sufficiently test their bespoke solution, and is that bespoke solution going to be as robust or as easily maintained as something using the STL?

Probably not, and this is a hard one for die-hard "purist" developers to swallow. We want to write all our own code, we want to be gods in our domain; the trouble is, for the vast number of us, god has already been there and he wrote a decent enough library to do the task we need doing... So leverage this!

I came across one such niche item the other day: an algorithm to check whether a string starts with a given pattern...

They hadn't used Boost, or the STL, to do the searching, yet perversely had used an std::string... Their code looked a little like this:

const bool StartsWith(
    const std::string& p_Text,
    const std::string& p_Pattern)
{
    bool l_result(true);

    if ( p_Text.length() >= p_Pattern.length() )
    {
        for (unsigned int i(0);
             i < p_Pattern.length();
             ++i)
        {
            if ( p_Text[i] != p_Pattern[i] )
            {
                l_result = false;
                break;
            }
        }
    }
    else
    {
        l_result = false;
    }

    return l_result;
}

It is fairly logical code. They're looking at the length of the presented parameters to avoid looping when not required, then they loop only from the start and only record a fail when a character is a mismatch. Looking at this with programming eyes from 1996, I'd say this is fine.

Looking with eyes well aware of the STL, I cringe a little, and I replaced this whole function like this...

const bool StartsWith(
    const std::string& p_Text,
    const std::string& p_Pattern)
{
    return (p_Text.find(p_Pattern) == 0);
}

One line of very much more maintainable, vastly more readable and easy to comprehend code...

The developer of the original, however, was not happy... "you're wasting resources, this will find any instance and tell you the input"... He's right, it will, but the STL will still be faster than his code.

I demonstrated this by plugging both into Compiler Explorer... He still refused to listen.

Therefore, I've written this little helper project to run the two functions side by side, with three threaded tests: looking for a match, a long match, and a negative match at the start of the string (code on GitHub).


The results of this are interesting. You see, the project itself favours cases where it's highly likely the string being searched for is present, and therefore we don't need to worry too much about the odd test taking longer when no match is found... This is exactly the behaviour seen in the STL-based find example.


The short search time, for the same data on the same processor, went from 28358 microseconds to just 5234... That's about 81% faster.  The longer search is more stark, falling from 185966 microseconds to just 6884, just over 96% faster!

The rub is that the negative case took longer, rising from 19765 microseconds in the hand-crafted search to 25695, just over 30% slower.  Some of this increase can perhaps be explained by the hand-crafted version using the lengths to quickly skip too short an input; otherwise it is simply that the STL find has to iterate over the whole string when no match is found.  A hybrid which skips the find entirely when there is insufficient data may be in order; however, this may add to our maintenance burden and lower code clarity.  Swings and roundabouts.
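For what it's worth, that hybrid needn't cost much clarity. A sketch (my code, not the project's) using std::string::compare keeps the length fast-out while never scanning past the pattern:

```cpp
#include <string>

// True when p_Text begins with p_Pattern. The length check skips
// obviously-too-short inputs, and compare() never looks beyond
// p_Pattern.length() characters, so a negative match stays cheap.
bool StartsWith(const std::string& p_Text, const std::string& p_Pattern)
{
    return p_Text.length() >= p_Pattern.length()
        && p_Text.compare(0, p_Pattern.length(), p_Pattern) == 0;
}
```

(C++20 later standardised exactly this operation as std::string::starts_with.)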

However, clearly in the case of this project, dismissing the STL resulted in slower code. We have a system propensity for matches, they're quite short, and all target platforms have the STL built in: use it.

Never be afraid to ask questions of what you're working with, ever.

Tuesday, 28 November 2017

CMake rather than Mammoth makefile marathons

I'm having difficulty communicating with some folks about the beauty of cmake and using ccmake to leverage that beauty.

These are folks who are either completely ignorant of what a makefile should look like, are happy to manage their own, or at worst are folks put off makefiles by having inherited projects which have spiralled out of control: mammoth makefiles so complex as to prevent any cost-effective entry grade for new developers, i.e. they're too hard to learn, or obfuscated sufficiently to allow established developers to retain their positions of glory and power.

I don't subscribe to that ethos, however, and believe that as a leader in development you should facilitate everyone being able to do everyone else's development role, be that starting a new project or continuing an old one.

It perhaps comes from my being able to work alone, defining a role which others are then keyed into; I have been forced to allow entry to my work, to make the cost of someone else bootstrapping my work into their wetware (brain) as low as possible.

Mammoth makefile marathons are not the way for me to do that; a CMakeLists.txt file, now that's a better proposition.  However, even here you have to take care, as some folks are ignorant of the tools available.  To leverage cmake directly one might use this kind of command line...
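Something along these lines, where every flag and path shown is purely illustrative:

```shell
cmake -G "Unix Makefiles" \
      -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_CXX_COMPILER=/usr/bin/g++ \
      -DCMAKE_INSTALL_PREFIX=/opt/myproject \
      -DBUILD_SHARED_LIBS=ON \
      ../myproject
```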


This is daunting for a newbie, and even an experienced developer has to admit...

ccmake <PATH_TO_CMAKELISTS>

This is a much more succinct and easy-to-access way of getting into your CMake way of working, the cost of entry being so low as to actually make introducing new developers to Linux development, or just general CMake usage, trivial.
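And the CMakeLists.txt being configured needn't be daunting either; a minimal one for a small C++ project (project and file names here are illustrative) is only a few lines:

```cmake
cmake_minimum_required(VERSION 3.5)
project(my_project CXX)

set(CMAKE_CXX_STANDARD 14)

add_executable(my_tool
    src/main.cpp)
```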

So where am I failing to communicate this?

Well, cmake, ccmake... The naming conventions of the two are so close as to confuse people; they don't hear the second C in ccmake, or they think I'm talking about the C programming language.  This is a lack of understanding on their part, but people being people, they don't want to admit they've no idea what you're talking about.

(As an aside, folks, if you want to be a good developer, and a good person, please admit when you don't know something.  It causes so many fewer issues at development scrum time if, when you are handed work, you simply state "I know nothing about that".  Someone else can be assigned the role, or better still, you can get training and schedule the work more effectively!)

My solution to this difficulty therefore?

Rename the programs, I've created two symbolic links in /usr/bin....

sudo ln -s /usr/local/cmake/bin/ccmake /usr/bin/makefile_prep_gui
sudo ln -s /usr/local/cmake/bin/cmake /usr/bin/makefile_prep_cmd

I've essentially bamboozled the communication factor by giving cmake the working name "makefile_prep", this means that those opposed to ceasing direct use of makefiles still feel empowered, but are subtly diverted to using an automated tool.

Immediately, questions and opposition to changing the status quo have ceased, and folks are talking about using the new "makefile_prep" tools... How clean they are, how nice the builds look, how they integrate with CLion easily, and "the output from my makefile_prep looks exactly like the build going on inside the IDE (CLion)"... Little do they realise they're both cmake!

Oiling the cogs of resistance to change, this is where I'm living at present... It's not an easy task, but sometimes it's rewarding... Now to do the same in the day-office.

Friday, 17 November 2017

Waking Up With My Dog

Someone was cold, this is where they perched...


Ain't he the cutest....

Wednesday, 15 November 2017

Time Expectations in WoW Classic?

With the news today of EA reducing by 75% the time it will take in Star Wars Battlefront 2 to unlock a hero character, can we expect to see modern gamers head into WoW Classic and start complaining?

(c)2017 Electronic Arts

Let's just recap though: in World of Warcraft, vanilla, people played a character to level 60 in about 4 days played, that's about 96 hours played.

EA's plight, and collapse into pandering, happened with a played time of around 40 hours, less than half that... Reducing it by 75% now makes a hero obtainable in SW:BF2 in 10 hours.

10 hours for vanilla WoW was not enough, and this is where my first concern with WoW Classic comes about: will Blizzard be forced to pander down to newer gamers, who most certainly want action and reaction, risk and reward?  They certainly don't deal in patience nor RNG.

My second concern is that this is a big issue, and perhaps Blizzard will side-step it by simply nerfing the amount of time to level, or perhaps increasing the rate of XP gain; arguably, since Burning Crusade or Wrath of the Lich King, the business model for WoW has been to push players to level cap as fast as possible and explore repetitive tasks in that area.

The way of thinking Blizzard have entertained (no pun intended) since I stopped playing was dailies, find a group, find a raid, tokens: faster reward for input, but with exponential damage, health and other stats making that power-wonder factor... Fast rewards.  I said during my appearance on "Shut Up We're Talking" that this was where things would be going.

Rinse and repeat.  That is all I've seen delivered, it's certainly all I've heard explained.  Should I name drop "Garrisons"?

All this said, however, we must remember that there will be a huge influx of tourists to WoW Classic, toe-dabblers; so what could the population of regulars, the hardcore who settle there, expect?  These are the things we don't know and, as you can tell by this post, can only speculate about.

Monday, 13 November 2017

Virgin media - Poor Speed Proof in Statistics

From my prior post, I have now appended the first ResultSet1.csv file - used to generate the chart in my previous post....

However, since then I have been doing some processing, and between 20:00 and 21:59 there are 191 entries in my little chart.  Remember, this is what Virgin refer to as "peak time", and they state an average of 50mbits over 24 hours.  I'll be fair here: they never state what actual speed they throttle down to or limit one at, but the chart here clearly shows 50mbits...


So, what average did I receive for 24 hours?

22.124 mbits/sec

Less than half the speed promised.

My speed during the peak time slot?  Averaged out as?

6.310 mbits/sec

Utterly pathetic...

You can download the results csv yourself, plug this into a spreadsheet and enjoy the proof positive of this dreadful situation.
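If a spreadsheet isn't handy, a few lines of Python will reproduce the averages.  This assumes the CSV columns are time, download and upload, as produced by the script from my earlier post (the function name and the hour filter are mine):

```python
import csv
from datetime import datetime

def average_download(path, start_hour=None, end_hour=None):
    """Average download speed from the results CSV, optionally
    restricted to rows whose hour falls within [start_hour, end_hour]."""
    speeds = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) < 3:
                continue  # skip malformed rows
            when = datetime.strptime(row[0], "%Y-%m-%d %H:%M:%S")
            if start_hour is not None and not (start_hour <= when.hour <= end_hour):
                continue
            speeds.append(float(row[1]))
    return sum(speeds) / len(speeds)

# 24-hour average, and the 20:00-21:59 "peak time" window:
# print(average_download("ResultSet1.csv"))
# print(average_download("ResultSet1.csv", 20, 21))
```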

To make matters that little bit worse, I have tried to call Virgin, and spent tens of minutes of my break and lunch in the queue to speak to someone, and their live chat simply performs this horrid loop:


Clearly the average upload speed of my test matches their predictions; it's not particularly impressive, but it is the speed as advertised.

Since I am having issues talking to Virgin (their Twitter minions keep passing me from pillar to post, asking the same dead-end and above all unrelated questions, and their actual customer services team are harder to talk to than they should be), I think I'll be forwarding this one to the communications ombudsman.

Sunday, 12 November 2017

Virgin Media - Poor Internet Speed Measured

Phase one of my plan is now complete, I have monitored the speed of my internet ALLLL day, from just after I got home in the morning, through until just now.

No other unit or device was being used from around 4pm until just now (the wife and I went to watch the new Thor film at the cinema), so how can Virgin Media explain that clear throttling down mid-afternoon...

It is awful and far below anything listed on their website, utter and total garbage speed.  Yet we see a general level of 25+ for most of the day.

You will need to click this chart to see the image close up.

My data points were taken with my script (see the previous post) every 10 seconds; it has generally resolved to the server in Leeds.

The time markers (green) were added by myself, artificially, by eye.  But they give you the gist of the time, and I will also upload the raw CSV somewhere...

I am now going to hit twitter.



Edit - This is fabulous, I've just gone to the link I was provided to send info to Virgin, and this is what happens....

That was the page I wanted.... And the same message appears if one tries to see the local service status!

This really is getting criminal, no-one would be able to run a business on this kind of service, no-one could perform research or educational activities, and I can't work in my technology areas of interest at all... And complaining has resulted in nothing but indifference and stagnation.

Friday, 10 November 2017

Virgin Media : Poor Internet Speed Misery

You know that moment in Misery where Annie (played by the excellent Kathy Bates) raises the lump hammer to Paul's (James Caan's) ankles?  That hopeless moment, where you know what's coming, and she's determined this is for the best, and he's helpless to change things...


Yeah, his feeling at that moment is the same feeling I get whenever I try to solve my service problems with Virgin Media.  I've tried the phone centre; they either won't talk to me, or deny I'm an account holder (the account being in my wife's name, but I'm a registered user of the account, etc etc)... Or they simply deny there's an issue...

"I can ping you now sir"....

Really, a few ICMP packets get through and you think it's a-okay do you?

Or I get told, reboot your superhub...

Or variously asked "are you on wifi or wired"... It makes no difference when the speed recorded by either is less than 2mbits!!!

And I've just been told in a reply on twitter "If you have been told about an Area Issue were you given an estimate as to when the issue will be resolved"... I've not been told anything about any area issues, nothing, nada, zip.

Therefore I'm still not best pleased.  Remember, I went down from paying through the nose for Vivid200, as I never ever got anywhere near 200mbits/sec, ever.  I did record regular speeds of around 34 to 50 mbits, so the safer, more cost-effective option was to pay for Vivid50.  Simple, see, a simple logical option: if they can't meet the expectations of their own service, play the system at its own game.

However, it seems that in reality, Vivid200 should be labelled Vivid50, and Vivid50 itself should be called Vivid1... Because breaking the 1mbit/sec barrier seems to be too much for it.

I have therefore decided to create something anyone can interpret, a chart... Managers love charts... People can interpret charts....

This chart is going to record the internet speed (recorded from my Linux server, on my wired Cat5e directly to my 1 gigabit router, which is directly wired to the Virgin Media Superhub 2 set into Modem Mode).  No wifi, no confusion, no bull, and my router runs pfSense, so I can see there are no shenanigans, just the pure speed throughput.

I'm going to record the speed with "speedtest-cli", which you can see yourself how to install here.

I will collect my results by running a python script, which runs the speed test and outputs the time, the upload and finally the download speed into a CSV file.

Find the source on my github... 

import subprocess
import time
from time import gmtime, strftime

# Open a simple text file for appending the results
resultFile = open("speedtest.txt", "a+")

while True:
    # Header text & placeholders for our result
    print ("Starting Test...")
    timeStr = strftime("%Y-%m-%d %H:%M:%S", gmtime())
    downloadSpeed = ""
    uploadSpeed = ""

    # Action the process to test our speed,
    # capturing its output
    result = subprocess.run(['speedtest-cli'], stdout=subprocess.PIPE)

    # Process the output into text & split the text
    # at each new line character
    btext = result.stdout
    text = btext.decode('ascii')
    lines = text.split("\n")

    # For each line, check whether it is upload
    # or download
    for line in lines:
        # For download, take a split against space;
        # the middle value is the speed
        if line.startswith('Download: '):
            speedParts = line.split(" ")
            if len(speedParts) == 3:
                downloadSpeed = speedParts[1]
        # Likewise for upload, the middle value is
        # the tested speed
        elif line.startswith('Upload: '):
            speedParts = line.split(" ")
            if len(speedParts) == 3:
                uploadSpeed = speedParts[1]

    # Print our output result as a CSV row
    print (timeStr + "," + downloadSpeed + "," + uploadSpeed)

    # Write the result to the file as well
    resultFile.write (timeStr + "," + downloadSpeed + "," + uploadSpeed + "\r\n")
    resultFile.flush()

    # Count down until the next test time
    count = 10
    while count > 0:
        # The line is overwritten each second, using end=""
        # and a carriage return to print in place
        print ("\rTime until next test " + str(count) + " seconds", end="")
        time.sleep(1)
        count = count - 1
    # Print a new line so the next output doesn't append
    # to the countdown line
    print()

I will then load this CSV file into a spread sheet and create a chart, here's one I created earlier with 5 test data points.


The blue line is where I'm most concerned; that is my download speed.  As you can see, within three minutes I had quite a difference, ranging from a high of 2.24 mbit to a low of 1.23 mbit.  Upload speed has been more consistent, giving a measly 3.5 to 4.0-ish.

I already know where VirginMedia will take the conversation, they will talk about "based on average peak time download performance".  However, I want to immediately counter that their speed information states that speeds are based around "Movie based on 4.1GB file size a single user and wired connection", and this chart is provided.... 

Average download speeds at peak time (8pm to 10pm), the times I have mostly messaged them on Twitter, are sub 1mbps... Right now, at just before 11am, they are still reporting as extremely low.  And yes, this server is the ONLY machine on in the house, the wifi is off, the other wire into this hub removed; there is one wire to one machine and one wire to their Superhub...

And yes, I can get 1gbit disk to disk over NFS on this hub, the wires to and from it to the machines are perfect, and I've also swapped the wire to the superhub.

I'm going to run this for a few days, and see what speeds we get in the dead of night, or early mornings, and see if there is a pattern.  I have known for years Virgin will throttle speeds, however, their table of speeds is labelled "Average", one can only believe we're on the lowest ebb of that bell curve, and I am not a happy customer.

Bon Jour

I had a spike of viewers over the previous week, and they put a certain country on the map of my analytics for the first time.... France!


Wednesday, 8 November 2017

Software Development - All Areas Stagnation

It has been said by far bigger and better minds than my own that if you sit still, if you don't continue to learn about new things and innovate, you will stagnate.  This has been a huge problem looming within the business I work in: certain things have worked since the industry sector was conceived, and even though more than half a century has passed, much of that time has largely passed the internals of this industry by.

That is, until very recently, when market competition has sprung up and the market base itself has reduced, so the pressure is on... Nowhere is this more apparent in my industry than in the software, the front line of pushing product to customers.

The trouble, however, seems to be that many people have stagnated; they've stuck with the safe option, the tools which work off the shelf.  I am of course talking about Windows: the entire tool chain used by 99% of the company is Windows based, and I am the man on the spot waving the Linux flag.

But just a few days ago, the Windows world had to come to my desk and see its future.  I had to show technically minded folks around the code of the new system, introduce them to my imposed coding standard and update them from Microsoft-specific Visual C++ thinking to thinking about platform-independent Standard C++ code... I had my work cut out for me.  I prepared the cleanest desktop environment I could (i3 on Ubuntu).


I didn't want to startle them, so the editor/environment was Visual Studio Code from Microsoft... They started to look at the system, its structure, how the code related to the design and the diagrams they already had, and we started to follow the process flow diagrams.

It was a success; certainly no-one burst into tears, and they saw the kinship between this code on Linux and the systems they'd worked with for decades on Windows.

But then, the senior software manager leaned down, peering at the screen, and he said some fateful words...

"I've never seen that before".

Is he talking about some piece of C++14 or C++17, the lambdas, the autos, the shared_ptr... What technical bolt has he not screwed his nut around?


"That's very good, you can see the whole code layout.  I've never seen that before, who did you say wrote this tool?... Really Microsoft, I've never ever seen that before".

This chap uses Sublime, I've seen him using Sublime... Which does exactly the same thing....


What is the lesson to be learned?  When we're talking about stagnation in software, we are not only talking about the language, but also the tools; and then not only the IDE, but the whole environment.

Certainly I was introducing Windows users to Linux, and even then on an unusual minimalist desktop manager, but still, the lack of connection between a tool I've seen people already using and what it is capable of demonstrated that tools are not being leveraged to their full potential... Certainly learn your new languages, learn your language updates, but keep your tools and environment up to spec too...

Monday, 6 November 2017

World of Warcraft - Classic

It's happened: the announcement with not quite enough information, but it did happen, and the WoW-orientated internet has slightly melted over it; everyone and their dog on YouTube has posted opinion pieces on whether this "in development" service will fulfil their desires.  I have no opinion on that; instead I'm going to talk about my Warcraft experience...

I first met the world that is Warcraft with "Warcraft II", which I remember my brother and I bought whilst on a trip into Nottingham.  We bought it from GAME on Lister Gate, and talked about it incessantly as our parents made us trudge around Marks & Spencer, rather than rushing home to play immediately.

We played this on our Intel 80486-SX2-50MHz, with 4MB of RAM and a 128K S3 video card - VGA baby!

I don't remember finishing the game; I have to be honest, I don't remember it that much... However, I still have the game, the exact box we bought is proudly on the shelf, and it has been visible behind me in most of my YouTube videos.  And in 2000 I returned to play the game in full, as I had played AoE2 to death, was vastly disappointed with C&C: Tiberian Sun, and so wanted to return to Azeroth.

Fast forward to the release of WoW, and a friend (Hi Paul) came to stay for the weekend, and he brought this game with him... And I walked under the trees of Elwynn Forest for the first time, killing pigs with his level 6 Dwarf Paladin... And I loved it.

The moment he went home on that Sunday evening, I went to the Asda superstore in Long Eaton and I bought the game, the original game, with the first edition of the manual... Many years later I threw this away, when I moved house, I still kick myself for that, and remember distinctly looking at it all in the bottom of the empty dustbin...

But at that moment this was brand new, and I set up my account and went to it... I didn't understand that my friend would be on a completely different server to me, so I just allowed the game to pick any server it liked.  I wanted to play and adventure in the game, so I picked what I thought of as the "single player" option... PVE.  I did not understand what an MMORPG was.

Into PVE I stepped: Alliance, Human, Warrior... And there I was.  I remember I played three evenings around Northshire Abbey alone... Just three evenings, about 2-3 hours each, maybe 6 hours played total, just to finish the Northshire Abbey quests, and then I was sent by an NPC to "Goldshire"...

There were no loading screens.  I thought I had moved from one zone to another when I left Northshire; I saw the section of map discovered, and I thought this is huge - whilst only looking at Elwynn on the map... And then I right-clicked... And the map stepped out... These slabs of undiscovered map... Darkshire to the south, Westfall to the west, Lakeshire... OMG, this is a continent.

And then I right clicked again.... Two continents, two whole continents... I was a very small cog in a very large world.

Then the server crashed; this was three or four days after release.

When everything came back up, I set about questing, and I overwhelmingly remember it taking so long, like a week into my experience I was wearing all white gear from quests, and was eyeing up an Axe - purchased from an NPC - for 10 silver.  Which was a hell of a lot of money.

I remember playing with a friend who had rolled a Paladin, when he sat down to drink, I sat down to drink, I didn't understand what drinking did, I didn't read the tool tips, I was so naive.

A week further on, and my hours sunk into the game expanded and expanded, I was sleeping less and playing more, I used to drive home from work at 12noon, play 30 minutes, not eat, then drive back to work... Where I would leave the second it ticked past time, to get home and play again.

I soon had more friends in the game with me, and I helped form a guild... "Arx"... Named for the Latin word for fortification or castle.  I designed the natty guild tabard in dark royal blue, a white tower on the chest and suitably lush borders... And we started to level.

I hit 40, and thanks to my side-line of mining - selling stacks of copper bars for 10 silver in general chat or by mail, as I didn't know where the auction house was; I'd not been to Ironforge (the only place there was an auction house at launch!) - I hit 40 dirt poor...

I set about that evening going further afield, I travelled to Stormwind and then through to Ironforge, and I found more and more quests, I found the auction house, and saw I could sell the copper stacks I had for 35 silver each, 50 silver as bars... So I did, and made money for my mount so much more quickly.  Back to Eastvale I went to train and buy, a brown mare whom I always referred to as "Nelly".

And I travelled the world, meeting new people... It happened in the swamps outside Theramore: I met a night elf.  The first I'd ever seen; she was a hunter, level 32 ish, and running... I was mounted.  I didn't know any different, this was a female character... I spoke to her, and she spoke to me, and we quested together... She joined my guild... THANK GOD she actually was a woman... (Hi Sue).

I fancied the pants off of this woman, and rightly so, I saw a picture a while later (grrr baby, grr), I made the mistake of that same assumption before... Not so much fancying, but wondering... "Is that actually a dude?".... (Hi Nick, aka Eve - lol - yes we wondered for a long time mate, before you joined vent!).

The trouble then started really.  You see, I was a warrior, and the guild's main tank, and I was struggling and pushing myself to level and gear up... The first little niggle was that my co-guild-leader (Hi Chaplain) was also a warrior, so there was a little trouble in that what I had, he had, and what he needed, I needed.  This was solved later when he re-rolled to a rogue, but we're talking very early vanilla, and having two warriors was a pain.

We also vastly lacked casters.  We had a lovely priest (Hi NW), but he would never come into vent, or certainly never spoke, making it hard to dungeon - however, he was a world-class, top-notch priest.  We had the lovely Sue as her hunter, a Paladin (aka Nick), and Chaplain on his warrior or rogue.  And then a rotation of different team members: we had a druid (Hi Hlaalu), we had a warlock (whose name I forget), and that was about it... For a long time.

A friend from work rolled a dwarf paladin, I think, and he got very, very drunk, and that caused friction between WoW in my downtime and my work life, so he left the server to play elsewhere.  We then had a few other paladins come and go, and then another female warrior - but she often wandered off to play Horde... True Blue represent, we ain't dealing with no half-Hordies here.

Mage and Warlock were always missing from our group, and it showed.

But we did get Strat and Scholo on farm.  Meagre progress, but we had it on farm; we could go there over and over.  The next chapter of drama was about to burst all over me: a few of the folks in the team were not really grasping the idea of progress, of gearing up.  Some folks would go spend two or three hours skinning or picking herbs, then go sell them, or skill up, and wonder why they were still taking a load of damage - inspect: because you're still wearing level 40 gear, and you're level 58.

Another person took an aeon to get from level 57 to 60, and when they finally did reach the level cap, they were too busy farming gold for an epic tiger mount... To be frank, it all got slightly derailed, but I had pretty nice gear, and kept improving it by going PVP battleground farming... Knight-Captain, thank you very much, which gave me boots and leggings and some other bits to augment my gear.  Sadly, though, I lost the plot here: I fell out with Sue... I fell out with a bunch of other folks, and basically kicked them out of the guild.

Chaplain and I re-rolled new characters, I think Nick did too, and we later swapped to a PVP server.

I never enjoyed another warrior like that first one ever again, I never met as good a healer as I had in NW the priest again, and PVP started to take over... I rolled my own hunter, and outdid Sue's progress - basically, as she'd moaned she couldn't gear up, I got a full tier 1 (as it was then) including the cap and bow of bones from Darkmaster Gandling, and the chest piece from General Drakkisath; I had the full set... Just to point out: spend your time and you get what you want... And I did this before she'd progressed further.

I'll be honest, it was petty, but I fancied her, it was all I had to brag about - look at my gear - eyebrows waggle - I look back on that boy and laugh now, but that doesn't mean I don't miss that kind of playing, and as you can see from this text, I still know so much about the game at that time.

I miss being the sort of Thottbot for the guild; literally, anything vanilla and Alliance, and I could pretty much answer on the spot.  I had learned the game.

As BC dawned I ventured as my hunter into the new lands, and I enjoyed them, but I also started my favourite of all my characters, a female human mage... Yeah, rocking the female toon now... And I went full fire, never frost.  I levelled from 1 to 70 in about four weeks, full fire, in every dungeon I could get to, and I got into the level 70 PVP world, especially in AV and WSG.

I also re-rolled a new Hunter, a female one, which I also levelled quickly to 70, and I played DPS classes - no more tanking from me - though I did keep a level 70 Warrior for the good old times.

I was mid-way through the vast and complex attunements in BC when Wrath was announced; yes, I was not a progression player in BC, I just enjoyed it, thoroughly.

I never high-end raided in BC - I tried Sunwell a little before Wrath came out - but I told myself I would try to raid in Wrath.

It never happened.  I took my Mage, blasted through the PVE to get to 80, ran dungeons collecting my marks, and just went full-on PVP.  I had a lovely full set of Gladiator gear, back when that was very hard to get.

However, I hated arena, I hated the beginnings of cross-realm play, and I hated that everything felt the same.  I still loved the game, and progressed my Hunter to 80, and then started a Druid - who, strangely, got a full set of tier gear raiding Black Temple as a level 70 - but I never level-capped everything.

I did Gruul's Lair with a guild I had joined, fell out of love with the game, and stopped playing long before Wrath's era was over.

Since then, I have played on Nostalrius, but I missed my team, I missed the feel of my original server.

I fear that many of us have rose-tinted glasses when it comes to Vanilla.  What I miss is the server, knowing everyone, the need to know everything and to travel, the RPG aspects.  And I can't help but think today's players want instant gratification: they are not going to wait around for travel to get somewhere, they are NOT going to want to plan their hearthstone and flight routes for optimum action in a quest hub, they want to point, click and win.

Will Classic servers work?  Perhaps.  Will they be Vanilla servers?  No.  That time has passed into legend.

Sunday, 5 November 2017

Virgin Media : Poor Internet Speed

I'm continuing to have issues with Virgin Media, so let me explain.  Last month I came to them asking to remove the superfluous junk items from my package, as I was simply paying more for services I didn't use, want or need.

They could not help; in fact they flat-out refused to, so I cancelled 66% of my services completely - TV and Phone, all gone and never ever to return - well done them...

The next item was the 200Mbit broadband I was paying for.  Below is a historic speed test I carried out often; as you can see I never, ever got near 200Mbit, but I always got over 50Mbit...


Being quite a logical person, therefore, and based on this evidence - this proof I had gathered for myself - I downgraded the internet from Vivid200 to Vivid50, i.e. 50Mbit... Yeah, save money, it can do 50Mbit easily, right... Right... RIGHT?.....

WRONG.

This is tonight's speed test...


Last night it was slower than that: I had a download speed of 1.22Mbit and an upload of 620K.

Yeah, rocking the 1990's internet speed.

And I'm utterly sick of it; I'm sick of paying for a service I never, ever get.  It may have been optimistic, maybe even naive, to think that having seen 50Mbit when paying for 200 I would get 50 when paying for 50, but clearly something has gone astray here.

Clearly, Virgin Media are up to some shenanigans - be that throttling, load balancing, whatever term they want to give it, or whatever they want to say in denial - but I'm observing this poor speed, I'm enduring this poor service, and I'm utterly and totally frustrated with the script (I would say expert system, but it's clearly a script) the people in the call centres follow.

This is NOT my equipment, this is not my network; this is their external speed, measured from their SuperHub 2 outwards, NOT on the internal side of that connection.

That is a test through the newer version of Ookla's site (the one at the top); as you can see, the speed is even worse twenty minutes on.  BTW, note to Virgin Media: your own engineers use this site to do a test upon installation.


And here's the speed an hour after my original post... this is now 21:32 on a Sunday evening...

Thursday, 2 November 2017

C++ : Ignored qualifiers (-Wignored-qualifiers)

What is an ignored qualifier?  Well, let's have a student class:

#include <string>

class Student
{
    public:
        const std::string Name;

    private:
        bool m_Present;

    public:

        Student(const std::string& p_Name)
            :
            Name(p_Name),
            m_Present(false)
        {
        }

        void SetPresent() { m_Present = true; }

        void SetAbsent() { m_Present = false; }
};

Now, we may want to access the present flag, therefore provide a function to do so:

    bool Present() const { return m_Present; }

This function tells the user of our class quite a bit: it tells them that the return type is boolean, and that calling this function makes no changes to the class contents (the const trailing the parameter list).

However, for me, this isn't quite right.  I believe we want to inform any user that the boolean returned is also constant - it does not change unless you alter the internal value with the "Set" functions - therefore I prefer, and like to see, code stating:

    const bool Present() const { return m_Present; }

This is perhaps overkill, and the earlier version is most of the time completely acceptable code to present; however, some might prefer not to see it - specifically anyone enabling "-Wignored-qualifiers" - because with "const bool" the const is technically superfluous: the return type is a boolean returned by value, a new copy, not a reference to the internal value.  If it were a reference, the function might look something more like this:

    const bool& Present() const { return m_Present; }

Now we are genuinely returning a reference to the internal boolean, or even:

    const bool* const Present() const { return &m_Present; }

To avoid taking a copy, we may instead hand back the address of the internal value as a constant pointer to a constant boolean.

I find this much more informative to the user: they know our intent, and the code, though more verbose, communicates its meaning much more clearly.

As ever, yes, I have seen questions asked of interfaces where "bool X()" or "bool X() const" is provided, but then programmers have asked "How do I change X?".  With the const return - ignored qualifier or not - they know not to ask this function to change X, and can look elsewhere in your API.