You know that crushing feeling when you are monitoring your architecture, you can see you are about to lose the site, and there is little you can do about it? So the site goes down, the business screams, and you start the long process of identifying exactly what went wrong.
That analysis is where this post started. You can spend a long time digging through code and logs to get to the bottom of a problem, only to find at the core of the matter some of the most ludicrous, thoughtless coding errors imaginable. More often than not the culprit is a third-party extension that you installed in order to get some functionality out quickly.
I now realise that quick approach is really an anti-pattern, throwing in an additional layer of complexity not of your own making. So not only are you choosing speed over everything else, you are also increasing your ignorance of the platform you are building on. I don’t like complexity, and I certainly do not like being ignorant of the platform that we are building.
Anyway, back to the code problem. How about a module that, every time you loaded a product page, scanned through every record in the order_items table to count how many times that product had been ordered? On a test environment or a vanilla Magento install it will be fine. On a platform that has been running for years, just watch the server grind slowly to its death. All because the developer thought it would be a good idea to show the user how many of a particular product had been sold. I like the idea and the functionality, but surely someone should have suggested doing this on a cron job and storing the results. Does it matter if the number is a day out? This particular module was in the Magento Marketplace as well.
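The fix is the standard precompute pattern: let a scheduled job do the expensive aggregation once, and let the product page read the stored result. Here is a minimal sketch of that idea; the table and column names are my own stand-ins, not Magento's, and an in-memory SQLite database stands in for MySQL:

```python
# Sketch of the cron-plus-lookup pattern: aggregate once per schedule,
# read cheaply on every page view.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE order_items (product_id INTEGER, qty INTEGER);
    CREATE TABLE product_sold_counts (
        product_id INTEGER PRIMARY KEY, sold INTEGER);
    INSERT INTO order_items VALUES (1, 2), (1, 3), (2, 1);
""")

def refresh_sold_counts(conn):
    """Run the expensive full-table aggregation once per cron tick."""
    conn.execute("DELETE FROM product_sold_counts")
    conn.execute("""
        INSERT INTO product_sold_counts
        SELECT product_id, SUM(qty) FROM order_items GROUP BY product_id""")

def sold_count(conn, product_id):
    """The product page does a cheap primary-key read of the stored result."""
    row = conn.execute(
        "SELECT sold FROM product_sold_counts WHERE product_id = ?",
        (product_id,)).fetchone()
    return row[0] if row else 0

refresh_sold_counts(conn)
print(sold_count(conn, 1))  # 5 (2 + 3 units ordered)
```

In a real Magento module the refresh would hang off the cron scheduler; the point is simply that the full scan of order_items happens once per schedule, not once per page view.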
And on to the second module, which decided to ignore the flat tables in the database and build a query against the EAV tables of such complexity that it took us three days to decipher what the hell it was doing. Of course, the problem arose because the developer just used the ORM to get all the data they thought they might need, which I will rant about in another post. The result was a query that took 4 seconds to run at best and 10 seconds at worst. Again, this was a module that was available in the Magento Marketplace.
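For anyone who has not met Magento's EAV schema, the difference is easy to show in miniature. This is not the offending query, just an illustration of why every attribute read from EAV costs a join (and real queries stack many of them), while a flat table is a single indexed lookup. All names here are simplified stand-ins:

```python
# Toy EAV-vs-flat comparison, assuming made-up simplified tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE eav_attribute (attribute_id INTEGER PRIMARY KEY, code TEXT);
    CREATE TABLE eav_varchar (entity_id INTEGER, attribute_id INTEGER,
                              value TEXT);
    CREATE TABLE flat_product (entity_id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO eav_attribute VALUES (71, 'name');
    INSERT INTO eav_varchar VALUES (10, 71, 'Blue Widget');
    INSERT INTO flat_product VALUES (10, 'Blue Widget');
""")

# EAV read: one join per attribute just to get a single value back.
eav = conn.execute("""
    SELECT v.value FROM eav_varchar v
    JOIN eav_attribute a ON a.attribute_id = v.attribute_id
    WHERE v.entity_id = 10 AND a.code = 'name'
""").fetchone()[0]

# Flat read: a single primary-key lookup, no joins at all.
flat = conn.execute(
    "SELECT name FROM flat_product WHERE entity_id = 10").fetchone()[0]

assert eav == flat == 'Blue Widget'
```

Multiply that join by a dozen attributes across several value-type tables and you get the sort of query that takes days to decipher and seconds to run.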
In conclusion: do not go for speed of delivery alone, go for speed and stability of the site, or you will end up paying twice for that module.
I tried very hard to work through the books I have on Clojure, and even got to the point of thinking some of the concurrency and parallel programming material was really very cool. But I have been away for a bit, have read nothing on Clojure, and feel I have got to that point where I will stop for a while.
My Masters starts up again soon and I really do not want to screw up my dissertation, so I might go back to Clojure when I have completed that. But then perhaps I won’t. It is so different from anything I have ever tried before.
So I am a firm believer in trying to learn a new programming language each year, and this year I am trying Clojure. And trying is probably the best I can do.
The last time I used a Lisp was back at university, and I rarely got above the trivial. So I have come to Clojure almost fresh. I have to say the concepts are fine, and I like the idea of functional programming and immutability, but as soon as it gets beyond the elementary level I struggle to get my head around it all.
I have bought books (four, in fact) and have been trying the Clojure koans and also the 4Clojure website, but I still struggle with anything from the more difficult sections.
So at the moment I am the ignorant apprentice waiting for the “aha” moment of Clojure enlightenment. I have yet to reach the despondent stage where I give up, but I feel it is not too far away.
Why am I writing this post? Because I have worked in SMEs for a while, and the smaller and younger they are, the more I worry about their attitude and ability to respond to security events. In fact, to even identify that they have had a security event at all.
This has been made worse in the last twelve months by the additional responsibilities of GDPR. There is now a potentially massive financial implication to a data breach, not just from the cost of remediation but also from the regulator.
Sure enough, there are plenty of vendors offering silver bullets to protect people from the evil hackers sitting out there in their dark rooms with their hoodies up. But there are several problems with that. The first is that there is no silver bullet, and relying on one is a big mistake. The second is that without security knowledge in the business, they are never going to mature and be able to deal with the emerging cyber-security threats. The third is that what the peddlers of silver bullets sell is not cheap. These small companies are (generally) concerned with building up their revenues to turn a profit, and these extra security costs are a luxury. So I suppose they could use open-source tools, of which there are plenty. But then we are back to knowledge and skills: open-source tools are by their very nature more difficult to set up, let alone set up correctly.
Another concern is the patching and updating of equipment. In a small company, is that really going to be a priority?
Even the tools from companies that promote defence in depth and tell you their products should be used in conjunction with others are just too expensive. I fully appreciate the reason for the expense; building some of these tools is a long process which costs money.
If you look at this from the hacker’s point of view, these companies provide an ideal learning ground for future breaches. Not only that, they also provide possible jumping-off points for bigger fish. Why would you attempt a breach against a massive organisation with sophisticated security prevention measures and a security team when you can go for one of their suppliers, who has no security knowledge at all?
So why am I writing this post? Because there seems to be a massive gap in the market: helping these companies meet their GDPR requirements while also protecting them from hackers. I am not sure what form a solution might take, but there is surely a possibility of somebody filling that gap.
There are a couple of things that concern me about Node, and the most pressing is security. It is just so easy to include hundreds of packages whose source is almost impossible to track.
If those modules are maintained by people you know nothing about, how do you know they can be trusted? How do you know that someone has not changed the code somewhere in those packages to do something malicious?
But that concern was precipitated by my very first concern: the sheer number of packages you can install without even trying to do so. Next time you play with Node, just check how many packages you have simply by including Express and a couple of components.
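If you want to put a number on it, counting the package.json files under node_modules is a rough but serviceable measure. This sketch builds a made-up directory tree so it runs standalone; point count_packages at a real project's node_modules directory to see your own figure:

```python
# Count installed Node packages by walking node_modules and counting
# directories that contain a package.json file.
import os
import tempfile

def count_packages(node_modules):
    """Return how many package.json files sit under node_modules."""
    return sum("package.json" in files
               for _, _, files in os.walk(node_modules))

# Build a fake node_modules tree with three "packages", one of them nested,
# so the sketch is self-contained.  Names here are illustrative only.
root = tempfile.mkdtemp()
for pkg in ["express", "body-parser", "body-parser/node_modules/bytes"]:
    path = os.path.join(root, pkg)
    os.makedirs(path)
    open(os.path.join(path, "package.json"), "w").close()

print(count_packages(root))  # 3
```

Run it against a project that does nothing but `npm install express` and the number it reports is usually a surprise.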
And then finally, perhaps the most churlish reason: just because anyone can now write backend code in JS does not mean that you should. Sometimes Node just does not do the trick. If you want to create a system that processes and transforms lots of data, Node is not the right tool.
Do people really think that throwing every new piece of cool technology at a project will make it a success? I love new technology and used to spend time playing with new frameworks so I could find a use pattern for them. But very few of them made it into projects, because there was no value to be added by doing so.
I have seen so many projects where the business case for building them is based on keeping a technology stack up to date. Great if that is going to add value or provide a competitive advantage. But spending time and money to move your website to the latest and greatest JS framework is not a valid reason.
This is not a blog post to show what is best practice in web development but a rant against those people who use the term ‘best practice’ to defend their use of a technology.
I shall explain with a couple of examples. Yesterday at work, a developer said the business could not have a web page with five even columns because they use Bootstrap, which uses a 12-column grid. I tried to point out that this was an issue and was told that using Bootstrap was ‘best practice’.
We are building a new website and it HAS to be hosted on AWS because it is ‘best practice’.
We are building a new website in PHP and we have to use a framework because that is ‘best practice’.
I have nothing against Bootstrap, AWS or PHP frameworks, but this highlights that people are building a solution before they have even looked at the requirements. Each project is unique and has unique requirements, so coming to the conversation with a fixed mindset of what solutions will work is just plain wrong.
It could be that these solutions are the best thing for the project but you can’t decide that because it is ‘best practice’. You have to decide that because it is best for the project.
So I was asked at work the other day if I could help find a missing server. I know that sounds odd, but it was running some software that nobody had ever needed to use yet was considered essential. Nobody knew its IP address, or the username and password used to gain access.
I had come across nmap in the past but never really understood how useful it could be. Then, as part of my studies, I learned of its amazing range of facilities. So after five minutes of research (dabbling), I found an invocation that would show me every single open port on every single device on the network. After filtering that down a little, I had some potential devices to try.
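I'll keep the nmap specifics out of it, but the underlying idea, attempting a TCP connection to each port of interest on each host, is simple enough to sketch. This toy version assumes nothing beyond the standard library, is nowhere near as fast or capable as nmap, and should only ever be pointed at networks you are authorised to probe:

```python
# A toy TCP connect sweep: a port is "open" if a plain connect succeeds.
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of ports on host that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect worked
                found.append(port)
    return found

# Demo against a listener we control, so the sketch is self-contained.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # the OS picks a free port for us
listener.listen(1)
port = listener.getsockname()[1]
print(open_ports("127.0.0.1", [port]) == [port])  # True
listener.close()
```

Loop that over a range of addresses and you have, in miniature, what the real tool did across the whole network in seconds.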
I am not sharing the nmap invocation because that would give away what I was looking for and possibly why.
Obviously that does not get you the access you need but perhaps that is the subject of another blog post.
When I started out in development (many years ago) I wanted to write everything myself. I did not want any shortcuts, or to use other tools that would cut the time; I wanted to write those tools, and then use them myself. This was also born of necessity: I started to develop at the age of 11, when the internet was probably just another military secret. So coding things yourself was almost essential, and books were great but only got you so far.
However, as I have got older (not convinced wiser) I have become more and more lazy, and I see very little merit in building something when someone else has already done most of the work for you. So on my latest project I am going for a very simple metric: write as little of your own code as possible. There is less chance of adding errors that way.
It also means that you can get to market quicker than if you were trying to hand-code everything. The purist in me (buried very deep) says that I could probably do a better job, one that suited the needs of the business better. But, and this is something that only experience brings, the business I am working for doesn’t actually care what the technical underpinnings of the solution are. They want their solution, and they will want it to be easy to maintain.
This causes a couple of problems. One: writing simple code with few lines is incredibly difficult to do, and the young developers who work for you are going to rebel and want to move on to some other, more exciting (i.e. writing more code) project. But one day, I hope, they will also see that writing all of that code is only worth it when you really have to.
So I got the grades for the second module of my Masters over the weekend. I was pleased with the overall grade but a little frustrated by the lack of feedback. On some elements I obviously did very well, but on others I was down at 60% or 65%, with no explanation of what areas I could have covered to gain those marks, or what else I should have considered.
I am assuming the marking is based on applying a more critical analysis in my answers, but I really do not know.