2022, secure?

First of all, I want to wish you all a healthy 2022. I hope that the pandemic will become endemic or, better still, end altogether.

Of course, real life is the most important. But in the virtual world of computers and software, a lot is going on as well. Security issues are quite epidemic. A short while ago, Log4j shocked the world, and although it is a Java issue, that does not mean that PHP is safe or that the threats are over. If you look at the impact of Log4j, you could even call it a pandemic. And you don't need to be a magician to predict that we will find more of those issues in 2022. Security is a common interest and a constant rat race between hackers and the legitimate world.

The Log4j issue brings an interesting aspect to the table. Log4j is widely used and comes from the open-source community. And that is exactly what the discussion is all about: in this case, the world depends on a handful of, usually unpaid, volunteers to fix the issue. Anybody can propose changes on GitHub, so the origin of code is not always clear. And if you write complex software, it is not strange that changing something in one corner affects how the software works in another corner. Even today, new patches for Log4j are being released.

What does that mean for the software development world? Do we have to go back to the situation where developers wrote everything themselves? Personally, I don't believe in that. Looking at Log4j, you could say that the functionality is overkill. I saw comments saying that the software was a mess and should have been built in layers on top of each other: if you only need a small part of the functionality, like 'just logging', why implement a complex routine? I don't know; I'm not a Java guru. But the fact that Java loads classes from outside, based on dependencies, is something that I think has its risks. I don't want to dive too deep into this particular situation, though, as I'm not an expert on it.
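That general risk, code being loaded or executed on the basis of outside input, is not unique to Java; PHP has its own variant of it. The sketch below is purely illustrative and entirely hypothetical: the class and function names are invented and are not part of PHsPeed or any library. It contrasts instantiating whatever class name arrives with a request against an explicit allow-list.

    <?php
    // Hypothetical sketch: a PHP analogue of "code loaded on the basis of outside input".
    // All class and function names are invented for illustration only.

    class CsvExportHandler  { public function __construct(array $payload) {} }
    class JsonExportHandler { public function __construct(array $payload) {} }

    // Risky pattern: instantiate whatever class name arrives with the request.
    // If an attacker can supply an unexpected class name, its constructor runs
    // with attacker-chosen data.
    function riskyHandler(string $handlerClass, array $payload): object
    {
        return new $handlerClass($payload);   // no restriction on what gets loaded
    }

    // Safer pattern: only instantiate classes from an explicit allow-list.
    function safeHandler(string $handlerName, array $payload): object
    {
        $allowed = [
            'csv'  => CsvExportHandler::class,
            'json' => JsonExportHandler::class,
        ];

        if (!isset($allowed[$handlerName])) {
            throw new InvalidArgumentException("Unknown handler: $handlerName");
        }

        $class = $allowed[$handlerName];
        return new $class($payload);
    }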

The real discussion here is: what are the risks if you use software development products like PHsPeed? What risks do you introduce into your architecture? This applies to all products, not only PHsPeed. Whether you use Visual Studio, Eclipse, or other code generators, they all rely on standard components. PHsPeed makes use of (mainly) MIT-licensed components. If those components have issues, then who is to blame?

First of all, we on the PHsPeed development team are very aware of the security risks. At every step we take, we try to imagine the risks involved. We follow sites like OWASP to see if there are new developments, and we keep track of updates to the included third-party libraries. But a chain is only as strong as its weakest link: if there is a security hole in one of the components and that component is in use, you have a breach. It is as simple as that.

So how do you mitigate?

  • If you find an issue, contact us. We may or may not already be aware of the problem. We will investigate the issue and try to find a solution.
  • Because of the component-based and OOP approach, a change in one module applies to all your projects: you only need to regenerate your code and deploy it. Meanwhile, you may want to disable the part of the functionality that has the issue (see the sketch after this list).
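To make that last option a bit more concrete: one simple way to disable an affected feature is a configuration-driven kill switch around it. The sketch below is generic PHP and entirely hypothetical; the flag name and the PDF-export feature are invented, and PHsPeed may offer its own mechanism for this.

    <?php
    // Hypothetical sketch of a kill switch for a feature that depends on a
    // vulnerable third-party component. Flag name and feature are invented.

    $config = [
        'enable_pdf_export' => false,   // switched off until the library is patched
    ];

    function exportToPdf(array $config, string $document): string
    {
        if (empty($config['enable_pdf_export'])) {
            // Fail closed: refuse the request instead of reaching the vulnerable code path.
            throw new RuntimeException('PDF export is temporarily disabled for security reasons.');
        }

        // ... call the third-party PDF library here once it has been patched ...
        return '/tmp/export.pdf';
    }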

What if the issue is not fixable, or cannot be fixed in time?

It is hard to say what we will do, because it depends. We might search for an alternative component to replace the original, but that is not always easy. We might see if we can fix the problem in the code ourselves. We are involved in some open-source developments (and have been for the past 20 years or so), but we do not have the resources to get involved in every library out there. In the worst-case scenario, we might drop the component without a replacement. Either way, security is critical to us, and if there are issues, we will drop everything we are doing and focus on that issue. That is the guarantee that we can and do give!

And the consequences for you, as a developer? If you build applications, then you test them. Nowadays, it is essential that you test for security issues too. You cannot delegate that responsibility to the supplier of your tools. There is no way for your supplier to guarantee that you don't introduce issues through your own code, third-party components, or even the code the tool generates.
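As a small example of what testing for security issues can look like in practice, the sketch below uses PHPUnit to check that user-supplied text is HTML-escaped before it is rendered. The renderComment() helper is hypothetical; the point is simply that this kind of check can live in your normal test suite.

    <?php
    use PHPUnit\Framework\TestCase;

    // Hypothetical helper that an application might use to render user input.
    function renderComment(string $userInput): string
    {
        return '<p>' . htmlspecialchars($userInput, ENT_QUOTES, 'UTF-8') . '</p>';
    }

    // A minimal security-oriented unit test: script tags must never survive rendering.
    class RenderCommentTest extends TestCase
    {
        public function testScriptTagsAreEscaped(): void
        {
            $html = renderComment('<script>alert(1)</script>');

            $this->assertStringNotContainsString('<script>', $html);
            $this->assertStringContainsString('&lt;script&gt;', $html);
        }
    }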

EU privacy law states that the owner of the data is responsible for its security. That means that if you write applications that handle sensitive data and you sell the application, your customer is responsible if data gets leaked. In fact, the situation between you as the developer and the supplier of your tools is similar.

But overall, security is a matter for us all. Applications will never be 100% safe, but if we all do our best and stay focused, we can come close. So if in doubt, contact us!

For 2022 we have a lot of plans, and version 2.1 is already in the works. On our design table is functionality that should improve security even further and provide some basic tooling to allow pen testing of your products. We hope to introduce it in one of our future releases. And don't forget, we listen to you! If you are missing functionality or have bright ideas, please let us know.

Happy coding!

31 Dec 2021