So I decided to take my work back underground to stop it falling into the wrong hands

#########

I went to UCL to study a 4-year programme in Mechanical Engineering. I wanted to work with big gas and steam turbines, for propulsion or bulk power generation. While there I realised that control systems were very interesting as well. UCL has (or at least had) a policy of housing every Fresher, and as many Finalists as it could fit into the remaining space. In my first year I was in Halls, then in rented accommodation with friends from my course for the second and third years, then for the fourth and final year we all applied to go back into Halls. Everyone was accepted… apart from me. I don’t blame UCL obviously; it was just a lottery.

But that was a turning point in my life, the point at which I drifted away both from academia and from that branch of engineering. In retrospect I guess I could have found some people who were in the same boat, rented a house with them, and kept the immersion in college life until I finished the year, but it was too easy not to. I actively avoid Java now, but in the mid-to-late 90s it was both cool and hot: as an early adopter I could easily get work paid much, much better than anything the big engineering companies offered their graduate trainees, and I would get to stay in London, which I thought at the time was very important. So I kept working as I had over the summer, living off-campus, telling myself I could find a way to make it all work. I couldn’t, and before I even realised it the year was over; I had missed too many lectures and not even made a start on my dissertation. I graduated with a BEng instead of an MEng.

It would be a stretch to say I regretted any of this; some of the friends I made working for a startup that year are still close friends today, for example, and I have built a solid career in the software engineering field. But at the same time I am conscious of the lost opportunity; London and all it offered would always have been there, whereas when the final term of the final year ends, that chapter is over forever. And maybe if I had stayed in that field I would be working at SpaceX or something now! So if I have any advice for students starting this year it’s to make the most of your time as an undergrad because it will be over in the blink of an eye. But also if an opportunity is there, take it!

Having completed the Data Science track of the Microsoft Professional Program, and inspired by the amazing season 2 of Westworld, I have now also completed the Artificial Intelligence track, Microsoft’s internal AI course that has just been opened to the public. This combines theory with Python programming (no R option this time, sadly) for deep learning (DL) and reinforcement learning (RL), leading up to a Capstone project, which I completed with Keras and CNTK, scoring 100% this time. Of the 4 available optional courses, I chose Natural Language Processing. The track also includes a course on the ethical implications of AI/machine learning/data science, something that should be mandatory for the employees of certain companies…
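To give a flavour of the practical content, here is a minimal Keras model of the sort the DL courses build towards. This is an illustrative sketch rather than my Capstone solution, with made-up data shapes, and it assumes the CNTK backend has been selected via the KERAS_BACKEND environment variable:

```python
# Illustrative only: a tiny CNN in Keras, of the kind the DL courses work with.
# Assumes KERAS_BACKEND=cntk (or tensorflow) is set before importing keras.
import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Dummy data standing in for a real image dataset (shapes are arbitrary).
X_train = np.random.rand(256, 28, 28, 1).astype("float32")
y_train = np.random.randint(0, 10, size=(256,))

model = Sequential([
    Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(64, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=1, batch_size=32, verbose=0)
```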

I had had some exposure to neural nets earlier, but this was my first encounter with RL, which was easily my favourite and the most rewarding part, and definitely something I want to explore further with tools like OpenAI Gym. A fair amount of independent reading is needed to answer the assessment questions in this and the other more advanced courses; obviously I was not looking to be spoon-fed, but it would have been better for the material to be self-contained. Rumsfeld’s Theory applies here: if you don’t know what you don’t know, how can you assess the validity or currency of an external source? For example, what has changed in Sutton & Barto between the 1st edition (1998) and the 2nd (October 2018, so not actually published yet!), and which edition was the person who set the assessment questions reading? Many students raised this concern in the forum and the edX proctor said they were taking the feedback on board, so perhaps by the time any readers of this blog come to it, it will have improved. The NLP course was particularly bad for this; I wonder if something was missed when MS reworked the courses for an external audience? So frustrating when it is such an interesting subject!
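To show the kind of thing RL involves at its simplest, here is a minimal tabular Q-learning loop (straight out of Sutton & Barto) against Gym’s FrozenLake environment. This is just a sketch, assuming the classic gym API where reset() returns the observation and step() returns four values:

```python
# Illustrative sketch: tabular Q-learning on FrozenLake with the classic gym API.
import gym
import numpy as np

env = gym.make("FrozenLake-v0")
n_states, n_actions = env.observation_space.n, env.action_space.n
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.99, 0.1  # learning rate, discount, exploration

for episode in range(5000):
    state = env.reset()
    done = False
    while not done:
        # Epsilon-greedy action selection
        if np.random.rand() < epsilon:
            action = env.action_space.sample()
        else:
            action = int(np.argmax(Q[state]))
        next_state, reward, done, _ = env.step(action)
        # Standard Q-learning update
        Q[state, action] += alpha * (reward + gamma * np.max(Q[next_state]) - Q[state, action])
        state = next_state

print("Greedy policy (0=Left, 1=Down, 2=Right, 3=Up):")
print(np.argmax(Q, axis=1).reshape(4, 4))
```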

Obviously there is not the depth of theory in these relatively short courses to do academic research in the field of AI. Each of the later courses (7-9) takes a few weeks, but to go fully in depth would take a year or more. There is certainly enough, though, to understand how the relevant maths corresponds to and interacts with the moving parts, to confidently identify situations or problems that DL and RL could be applied to, and to subsequently implement and operationalize a solution with open source tooling, Azure, or both. Overall I am pretty happy with the experience. I learnt an awful lot, have plenty of avenues (in addition to the RL mentioned previously) to go on exploring, and have picked up both a long-term foundation and some skills that are immediately useful in the short term. Understanding the maths is so important for developing intuition, and is an investment that will continue to pay off even as the technologies change. Having worked on this part time over several months, I am very conscious that a lot of this stuff is quite “use it or lose it”, so I will need to maintain the momentum and internalize it all properly. For my next course I think I’ll do Neuronal Dynamics, or maybe something purely practical.

Oh, and I previously mentioned that I had finally upgraded my late-2008 MacBook Pro to a Surface Laptop. The lack of a discrete GPU‡ on this particular model means that the final computation for the Capstone took about an hour to complete… On an NC6 instance in Azure I am seeing speedups of 4-10× on the K80, which is actually less than I had expected, but still pretty good, and I expect the gap would open up with larger datasets. I think I will stick with renting a GPU instance for now, until my Azure bill indicates it’s time to invest in a desktop PC with a 1080; I’m just not sure that it makes sense on a laptop. Extensive use is made in these courses of Jupyter Notebook, which when running locally is pretty clunky compared to the MathCAD I remember using as a Mech Eng undergrad in the ’90s, but there is no denying that Azure Notebooks is very convenient, and it’s free!
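If you want to make the same kind of CPU-versus-GPU comparison yourself, the simplest approach is just to time identical training runs on both machines. A rough sketch, with an arbitrary stand-in model and data rather than the actual Capstone workload:

```python
# Rough comparison: time an identical Keras fit() on each machine.
import time
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

X = np.random.rand(50000, 784).astype("float32")   # synthetic "dataset"
y = np.random.randint(0, 10, size=(50000,))

model = Sequential([
    Dense(512, activation="relu", input_shape=(784,)),
    Dense(512, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

start = time.time()
model.fit(X, y, batch_size=256, epochs=3, verbose=0)
print("Wall-clock seconds for 3 epochs:", round(time.time() - start, 1))
```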

To be successful in tech, it’s well known that you must keep your skills up to date. The onus is on each individual to do this; no-one will do it for you, and companies that provide ongoing personal development are few and far between. Many companies would rather “remix our skills”, which means laying off workers with one skill (on statutory minimum terms) and hiring people with the new skill. This is short-termist in the extreme: the new workers are no better than the old, they just happened to enter the workforce later, and the churn means there is no accumulation of institutional knowledge. If you were one of the newer workers, why would you voluntarily step onto this treadmill? And if you were a client, why would you hire such a firm when it provides no value-add over just hiring the staff you need yourself? Anyway, I digress.

It is clear that C++11 was an enormous improvement over C++98. The list of new features is vast and all-encompassing, yet at the same time backwards compatibility is preserved: you can have all the benefits of the new while preserving your investment in the old (“legacy”). Upgrading your skills to C++11 was a very obvious thing to do, and because of the smooth transition you could make quick wins as you brought yourself up to speed. That is just one example of the sort of thing I am talking about. You still need to put the effort in to learn it and seek out opportunities to use it, but the path from the old to the new is straightforward, there are early and frequent rewards along the way, and from there it continues to C++14, 17, 20…

But I look around the current technology landscape and I see things that are only incremental improvements on existing programming languages or technologies and yet require a clean break with the past, which in practice means not only learning the new thing, but also rebuilding the ecosystem and tooling around it, porting or re-writing all the code, encountering all-new bugs and edge cases, and rediscovering the design patterns and idioms of the new language. The extent to which the new technology is “better” is dwarfed by the effort taken to adopt it, so where is the improved productivity coming from? Every project consists of either learning the language as you go, or maintaining and extending something written by someone who was learning the language as they went, perhaps gambling on getting in on the ground floor of the next big thing. The paradox is that things only get big if people stick with them!

So I am pretty comfortable with my decision to mostly ignore lots of new things, including but not limited to Go, Rust, Julia, Node.js and Perl6, in favour of deepening my skills in C++, R and Python and pushing into new problem domains (e.g. ML/AI) with my tried and trusted tools. When something comes along that is a big enough leap forward over any of them, of course I’ll jump, just like I did when I learnt Java in 1995 and was getting paid for it the same year! I had a lot of fun with OCaml and Haskell (and Scala) too, but none of them gained significant traction in the end. I don’t see anything on the horizon; all the cutting-edge stuff is appearing as libraries or features for my “big 3”, while the newer ecosystems are scrambling to backfill their capabilities and will probably never match the breadth and depth before falling out of fashion and fading away. I’ll be interested in any comments arguing why I’m wrong to discount them, or any pointers to things that are sufficiently advanced to be worth a closer look.

Why did we (developers) flock to MacBooks? Even if we were using platform-agnostic languages and/or writing applications that would run on servers, we wanted portable Unix workstations with a high build quality and none of the hardware compatibility issues that come with trying to run Linux on a laptop. It’s been over 20 years since I first tried that and it is still woeful. The only way to run Linux on a laptop, even now, and not lose your mind is as a virtual guest of Windows or OSX. And with OSX all the power of Unix is right there already, great!

But Apple have really dropped the ball recently. The build quality isn’t there anymore, the CPU/GPU/memory specs of the MBP are lagging†, and there is a new player in town: Windows Subsystem for Linux. And it is seriously impressive, super-slick and Just Works™. Debian is available, and you can develop for it with Visual Studio. There are still a few things to iron out (I still haven’t quite figured out how to have a single project that can target both), but there is no need to run a heavyweight, high-overhead VM or even a container: it’s deeply integrated with Windows and the experience is pretty seamless. I’m running it on a Surface Laptop now, and by the way, I love the keyboard and I love the screen on this device. My first new laptop since 2008…

I think this is going to cost Apple a lot of developer mindshare, as long as MS manages not to screw up their acquisition of GitHub‡, and where the devs go the apps go and the users follow. I saw first hand a decade ago in the wholesale migration from SPARC/Solaris to Linux on x86 that a superior OS can’t save a vendor if they don’t have a good hardware story, and it’s not as if OSX can claim to be far ahead of Windows anymore. What amazing new feature did they demo at WWDC – the animated poop emoji??

… and shows how they all fit together into a “big picture”. Obviously the course is run by Microsoft via edX and does make use of some Microsoft technologies such as Azure ML Studio, but it is not actually particularly Microsoft-centric. The maths is universal and most of the programming is in open-source languages; for example, I completed the final Capstone project with the free RStudio on my late-2008 MacBook Pro (achieving a final score of 97%).

So I definitely recommend this course (it’s free if you don’t care about getting a cert at the end, and it doesn’t require owning any high-end hardware; all you need is time and self-discipline). I think there is a lot of data science hype around right now, and a lot of unrealistic expectations both from data scientists and from the organisations employing them. I am certainly not planning on any abrupt career changes myself! But when the smoke clears and the dust settles, these kinds of skills will be applicable to all industries and most roles, even if the job title isn’t Official Data Scientist. Data munging/wrangling (or “ETL”, to use the fancy term) is something I’ve done my entire career, for example, but I haven’t previously done much dimensionality reduction or feature engineering, and since I do forecasts of things all the time I will be looking to apply some of that, perhaps.
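To make “dimensionality reduction” concrete, it is the sort of thing that takes only a few lines once you know what you are doing. A sketch in Python with scikit-learn (rather than the R I used on the course itself), on made-up data:

```python
# Illustrative sketch: dimensionality reduction with PCA on hypothetical data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: 1000 observations, 50 raw features.
X = np.random.rand(1000, 50)

# Standardise first, otherwise the largest-scale features dominate the components.
X_scaled = StandardScaler().fit_transform(X)

# Keep enough principal components to explain 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X_scaled)

print(X_reduced.shape)                      # (1000, k) for some k <= 50
print(pca.explained_variance_ratio_.sum())  # roughly 0.95
```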