Thursday, 28 November 2019

From Intel to AMD...

Well, that just happened... I bought a Ryzen CPU...

That may not seem a huge revelation, but let me be clear: in my PCs I have always (well, nearly always) had Intel CPUs. But the current third generation of Ryzen has Intel on the ropes, and I have to get moving with my new PC build.

And I've gone in what may seem a strange direction.

Initially I thought about third-gen Threadripper, but I really cannot justify the expense. Most of my work is done on servers, and though I could bring it all onto my workstation for compiles (like LLVM), really that's a convenience and not worthy of nearly £2,500 (for a processor and motherboard).

So what did you do Xel?... WHAT DID YOU DO?

Well, I've gone with an AM4 socket motherboard, a very good AM4 socket motherboard, and I've gone with a 3rd Gen Ryzen Zen 2 architecture processor, but perhaps not the one you'd expect.

You may expect me to have gone with the Ryzen 9 3950X, and you'd be right - sometime next year, when they're properly out, in stock, and prices have settled some. But today, pre-Christmas, they're like rocking horse shite: rare and costly.

The processor I've gone with then is the Ryzen 5 3600X...

Yes, it's Zen 2; yes, it's AM4; yes, it's only 6 cores and 12 threads, like my current workstation machine. But it was only £200, and it comes with the stock cooler, so I can practise with that before delving into fitting the AIO.

I've gone with an Asus ROG top-tier motherboard for this class of processor, and I plan to take it up to the 3950X with its 16 cores and 32 threads sometime next year.

For memory, I've gone with two kits of 2 x 16GB Corsair Dominator RGB, for a total of 64GB of RAM, which is a massive upgrade and maxes out the motherboard.

For that's the big thing I'm giving up: had I gone with Threadripper 3, I'd automatically have had double the RAM slots (8, to be precise), and 128GB of RAM is a common maximum on X399 boards... but here on the X370s it's usually 64GB (though some boards take 128GB).

The motherboard, RAM and processor all come from Amazon, for (to me) the princely sum of £700. So the whole new machine, with the PSU, AIO, case and storage I already have, has topped out at £1,100.

Expect build videos and tech tinkering footage soon.

Tuesday, 12 November 2019

Second piece of kit....The PC Case

And so the purchases start to roll in. Tonight I've ordered a new case, and I've gone with the Corsair Obsidian 750D Airflow.


I plan to swap out the vanilla fans at the front for RGB ones, and the cooling solution is likely to be a waterblock-to-radiator loop with RGB too... ditto RGB on the memory... so a little bling.

But tonight, the case: it's the all-black version... it is "Obsidian", after all... with a window on the right side, so I'll be able to see it all to my right as I sit at my desk.

I'm planning to make the current machine my main office server/file server, but that'll be when I move onto an all-Linux base. Yes, I'm going to retire Windows as my main boot environment and go with Ubuntu, and I'll run VGA passthrough on KVM to a Windows VM if I need to play games or run VS.
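If you fancy following along on the passthrough front, the usual first step is checking that the platform exposes IOMMU groups at all. This is a generic sanity check rather than my exact setup, and it assumes IOMMU is already enabled in the BIOS and on the kernel command line:

```shell
# List IOMMU groups and the PCI devices in them; VGA passthrough needs the
# GPU (and its HDMI audio function) to sit in a cleanly separable group.
for dev in /sys/kernel/iommu_groups/*/devices/*; do
    [ -e "$dev" ] || continue   # glob matched nothing: IOMMU is disabled
    group=$(basename "$(dirname "$(dirname "$dev")")")
    echo "IOMMU group $group: $(basename "$dev")"
done
```

An empty listing means the kernel isn't doing IOMMU at all, which is worth knowing before committing to a second GPU.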

Everything else will go onto my file server, and to that end I've also ordered two more 4TB drives. I'm only getting WD Blue: I've had great personal success with WD Blues, with only one failure in about 15 years (much lower than Samsung, SanDisk, Hitachi and Toshiba) - and even then I got excellent customer support.

So, 8TB more storage... I've also gone with a 500GB SSD, again a WD Blue, because I have a 500GB NVMe M.2 SSD I can pull out of my current machine for the main boot.

I'm sticking with my EVGA GTX 1080 SC graphics card, and I'll put the older GTX 770 SC into the file server - or maybe something less power hungry.

Either way, for the moment the new machine will go into this new case and the old machine will remain in the now very aged but trusty Cooler Master Cosmos 1000. Eventually I'll recase even that into a matching Obsidian, maybe one without the airflow front, and keep the Cosmos as a spare (you never know).


But all this is on order... The only part to have arrived so far is the power supply.

I've had some Molex to SATA death cables arrive too, but they're for a server project which I've been trying to film for you guys for a few days.

This isn't to be confused with my current under-desk server; you can see me working on its case here:


I've also started to record snippets of my review of the 6-core, 12-thread laptop I've been using for the last 6 months, so I hope to start editing that together.

I will also, hopefully, this weekend be able to start showing you why there has been a large chunk of time missed in my posts... What have I been busy doing?... 

Sunday, 10 November 2019

My First Piece of Software

What was the first piece of commercial software you wrote and what did it do?

This was a good question I was asked by a student interviewing me about my entry into being a software engineer, so I thought I'd share the answer.

It was a PC auditing application, written in Turbo Pascal for DOS. It scanned your files, identified a bunch of known executables, and produced a fingerprint of your machine (linked to either the network MAC address or the BIOS serial number). It showed a number on the screen, and I had a silver (label) punch to stamp that number into a hard metal sticker, which went onto the PC case.

This was used by Claremont Garments* (by me, actually) to audit almost all the machines they had. It was related to a commercial product we'd bought to do a similar task, but after a bit of tinkering I found my own solution did the job better, and I could export the data to a spreadsheet to produce reports more easily and cheaply than paying for the custom import tools.

I used this software throughout the UK: I sent it to a bunch of folks working with me from the London office, then the North East office, and finally it went with me on my sojourn in Morocco to audit everything.

I was actually waxing lyrical just today about in-order execution on the 486; that first piece of software of mine was definitely operating "in order".

To this day I have fond memories of programming in Pascal for DOS. And though this was not the first major piece of software I'd written, it was the first used for a commercial purpose, and by someone other than me.




* Interesting link to the past: the Claremont site at which I worked was the Selston factory, and Selston is the town in which I now live... I have fond memories of that job, and I think a fair few people around here remember that factory. It lives on under "Lilley Close", which is built on the site, but there is a commemoration of the factory in its original "Wood Bastow" incarnation... That may be another post I put together to share with you all.

Saturday, 9 November 2019

How did I halve my Build speeds?... Oh

Deconstructing my build hive... As you may tell from earlier posts this month, I'm on a mission to sort out some of my equipment and either recycle it or whatever, and one of the first stops on this mission has been the older dual-socket servers, which live in the garage on an Ethernet spur at the moment.

And I've taken them down. Now, I didn't really appreciate how much they lent to my build times: my main app server (on which I coordinate the building of gcc projects with distcc) started to report 4x longer builds (on average), and I couldn't find a reason for it.
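For context, the distcc fan-out itself is simple. A sketch of the sort of thing the app server does - the host names and job counts below are invented for illustration, not my real farm:

```shell
# Each entry is host/limit: the number of compile jobs that host will take.
# localhost builds too; the remote boxes absorb the rest of the fan-out.
export DISTCC_HOSTS="localhost/12 garage-a/16 garage-b/16"

# A build is then just make with the compilers wrapped in distcc, with -j
# sized roughly to the total job slots across all hosts:
#   make -j44 CC="distcc gcc" CXX="distcc g++"
```

Take one of those hosts away and the coordinator quietly falls back to fewer slots, which is exactly the kind of slowdown that's easy to miss.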

I looked at the app server, I looked at the jobs being thrown about, and really there was no reason for the massive slowdown...

Until I remembered something about my VDI set-up: the app server used two sources for its virtual disk images. One was local, and one was passed over to the containers on the garage servers; those images could be mounted over NFS, but more often I had the garage servers sync them and mount them locally from their own disks.

In migrating the build containers back onto the app server, I'd moved them from their locally loop-mounted VDIs to ones hosted on the app server... from 15,000 RPM SAS disks on a 12 Gb/s interface to a 7,200 RPM SATA disk at 6 Gb/s.

I'd essentially halved the disk speed of my builds and totally forgotten about setting this all up. I did it years ago - it must be about two years since I set it up - and it's been nearly a year since I moved house and started everything back up on the Ethernet spur, which necessitated the copy/sync-and-local-mount option to get over the slow 10 Mb/s Ethernet in the garage.
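A crude way to catch this sort of regression earlier: time a sequential write against whichever filesystem actually hosts the images. The path below is just a stand-in for wherever your VDIs live:

```shell
# Write 256MB and let dd report the MB/s figure; conv=fdatasync forces the
# data to disk before dd exits, so the page cache doesn't flatter the number.
dd if=/dev/zero of=/tmp/vdi_speedtest bs=1M count=256 conv=fdatasync
rm -f /tmp/vdi_speedtest
```

Run it once per storage path and the 15k SAS vs 7,200 RPM SATA gap shows up immediately.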

My plan therefore is to get the new power supply, experiment with the Supermicro motherboard, and then see if I can set up a new ZFS-based file server on that SAS controller... Watch this space.

Friday, 8 November 2019

Dull Directory Colour in Ubuntu Server

One of my problems with the latest version of Ubuntu Server is the default directory colour for the ls command. I'd never engaged with this before, but it's a dark blue. It's probably always been dark blue - run ls with "--color=auto" and you'll likely see it - but the server always left output uncoloured by default... until now.

And I can't see the dark blue very well; I have reduced the amount of blue light on all my screens and dimmed things down for long hours in front of the screen, so dark blue really, really doesn't work.

I have to change my folders, but I don't want to override everything... only the directory colour. If you take a look at the ~/.bashrc file, you'll see the ls colour defaults to auto:


But I figured out that the colour defaults first and THEN another alias will override it... So I leave the auto alone and add a new alias at the bottom of the file:


This is setting the "di"rectory to colour 33, which is orange...

Reboot and this is what I now see...


The cyan colour of the files comes from the auto setting, and likewise the highlight around tmp. But the orange comes from my alias... Lovely, now I can read the server information even on this dim, dim screen... eye candy!

If you want to follow more of this, check out this source.

Thursday, 7 November 2019

Doubled my All Time Views

Since September we've doubled my all-time view count: it was around 250,000 total views for all time... As of the time of writing, 465,000 total views... in just over 2 months?

I'm a bit amazed.

Pageviews, all-time history: 465,076

Wednesday, 6 November 2019

Personal PC Kit Stock Take - And First new PC Part Purchase

It begins.... A new PC build.

Yes, it's that time: I simply need more power in the workstation on my desk. I've been farming more and more work off to servers throughout the house, and I did a stock take of all my kit... I've not done this since the house move, and indeed I documented the last stock take on these pages.

Workstation
Xeon X5670 (6 core, 12 thread) - 32GB RAM - Used all the time

Single Socket Servers
Xeon L5640 (6 core, 12 thread) - 8GB RAM - DistCC slave, Git Host & Fire Host - Used Rarely
Core 2 Duo E6750 (2 core, 2 thread) - 4GB RAM - General always-on server (VPN machine for Chaplain)


Dual Socket Servers
Xeon E5240 (2 core x 2) - 16GB RAM - DistCC slave, Squid Host, Proxmox Host - Used Rarely
Xeon E5250 (2 core x 2) - 16GB RAM - DistCC slave, WoW Classic Staging Server - Used Rarely
Xeon E5430 (4 core x 2) - 32GB RAM - DistCC, Dell BMC Test Bed, ZFS Mirror, Docker Container Host - Always On
Xeon E5430 (4 core x 2) - 32GB RAM - Xen Server Host
Xeon E5340 (4 core x 1) - No RAM - Spares for other Dell boxes
Xeon E5-2620 (6 core, 12 thread x 2) - 128GB RAM - DistCC Build Coordinator, Main Server - Always On


It has to be said, nearly none of the servers have been used regularly, as I've just been so, so busy.

Laptops
HP Omen (6 core, 12 thread) - 8GB RAM - Used Rarely
PC Specialist (4 core, 8 thread) - 8GB RAM - The wife's machine, used as a network drive mirror
Lenovo ThinkPad E480 (4 core, 8 thread) - 16GB RAM - Main Laptop


What I decided is that a lot of this server infrastructure has got to go, and to do that, the tasks the servers perform - even infrequently - have to be picked up by something. And since I usually spend my time in my new dedicated office, I may as well have as much of that work here in the room with me.

The immediate point was to take on all the work of the two main machines: the E5-2620 dual-socket server (so 24 threads) and my workstation (12 threads) - that's 36 threads of work to contend with doling out.

But I immediately thought about keeping costs low. The E5 class of Xeons has come down in price, with 10 cores and 20 threads apiece; two of those may be the way I decide to go. Forearmed with 40 threads, it's then a case of getting a dual-socket motherboard for my workstation - and since I game on the machine too, perhaps the slower L versions of that class wouldn't cut it... but research will be had.

As for 128GB of RAM or more: seeing as these processors are last-gen, that means DDR3 - specifically server ECC DDR3, which is fairly cheap. And of course I have a whole bunch already in the house. Unfortunately my 128GB is in 4GB sticks, but it's enough to get me up and testing X79 motherboards from AliExpress.

With the stock of parts I have on hand, it's therefore a task of recasing things, making them quiet, and having them bearable in the office with me... I've been enamoured with the Corsair Obsidian range of cases for a while... but even before the case: power...

My E5-based server is a Supermicro, but the motherboard within is actually just an EATX board, so with a spare PSU I could power it up in a new case, maybe... And that's where I've decided to start: my first purchase has therefore been a new PSU.


This one is platinum rated - that's highly, highly efficient - with all Japanese capacitors; it's workstation class and completely modular.

And you know what, it's only the third power supply I've owned. You may recall the Christmas Eve rush for a PC from a mate of mine; well, I'm pretty sure the problem there was the PSU in that machine... and I had no spare to offer, which I felt quite bad about. But now I will have a spare, because my workstation will be shuffled down one notch into other duties.

The last focus I had on cases was cheapness...


The next one will be style & function.