Black Boar Security Inc. | Know It
Recent developments have highlighted a series of unfortunate incidents within the computer industry, in which tampered pagers resulted in injuries. It is essential to focus here on the technological aspects rather than on intent or other related issues. Such incidents damage the reputation of the entire electronics industry. For approximately a decade, we have been told that Trusted Platform Modules (TPMs) are the de jure standard for keeping secrets. These incidents serve as empirical evidence to the contrary.
Copyright © Schmied Enterprises LLC, 2024.
When tasked with securing a computer system, the optimal approach mirrors the principle of divide and conquer. This involves creating a market and enabling customers to price each component. Educational institutions, for instance, utilize both laptops and paper for students. This strategy prevents any single industry from monopolizing the market, thereby reducing prices.
Another viable strategy involves modifying compliance rules to permit the storage of secrets both within and outside of TPMs. Statistical analysis can readily identify the areas where TPMs are insecure and the geographical regions that are safer. Customers can then choose and price these systems accordingly.
This strategy underscores the fundamental principle of computer security: awareness of your system's vulnerabilities is crucial.
For highly critical systems such as gas and oil pipelines, the de facto standard is to ship systems with plain TCP sockets and no security at all. This places the responsibility for stability on the engineering firm, allowing it to concentrate on reliability. The distributor and the client can then layer OpenSSL or a similar library on top, according to their own standards and legal requirements.
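As an illustration only, the sketch below shows how a distributor might wrap such a plain TCP service in TLS at deployment time. Go's standard crypto/tls package stands in for OpenSSL here purely to keep the example short; the certificate paths, ports, and backend address are placeholders, not part of any real product.

    package main

    import (
        "crypto/tls"
        "io"
        "log"
        "net"
    )

    func main() {
        // The integrator supplies the certificate and key; the vendor's
        // application never sees them. Paths are placeholders.
        cert, err := tls.LoadX509KeyPair("site.crt", "site.key")
        if err != nil {
            log.Fatal(err)
        }
        cfg := &tls.Config{Certificates: []tls.Certificate{cert}, MinVersion: tls.VersionTLS12}

        // Accept TLS connections from clients on port 8443.
        ln, err := tls.Listen("tcp", ":8443", cfg)
        if err != nil {
            log.Fatal(err)
        }
        for {
            client, err := ln.Accept()
            if err != nil {
                continue
            }
            go func(c net.Conn) {
                defer c.Close()
                // Forward the decrypted stream to the vendor's plain
                // TCP service running on the same machine.
                backend, err := net.Dial("tcp", "127.0.0.1:9000")
                if err != nil {
                    return
                }
                defer backend.Close()
                go io.Copy(backend, c)
                io.Copy(c, backend)
            }(client)
        }
    }

The vendor's application keeps listening on its plain socket, while all key material and cipher policy stay in the hands of the integrator.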
This system offers several advantages. The local administrator retains complete control over how SSL sockets are opened and closed in every application within a system. This explains the ubiquitous inclusion of OpenSSL even in ecosystems that could implement TLS on their own, such as Java or Golang. However, it also presents a single point of failure that hackers can exploit.
Since they entered public awareness approximately a decade ago, random number generators have become a de facto target of general penetration attacks. Tampered random numbers can undermine any encryption method.
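As a minimal sketch of why this matters, the fragment below draws one key from Go's crypto/rand and one from a deterministic math/rand source with a constant seed; the seed and key size are illustrative. Anyone who knows or controls the seed can regenerate the second key byte for byte.

    package main

    import (
        "crypto/rand"
        "fmt"
        mrand "math/rand"
    )

    func main() {
        // A 256-bit key from the operating system's CSPRNG.
        strong := make([]byte, 32)
        if _, err := rand.Read(strong); err != nil {
            panic(err)
        }

        // A 256-bit key from a deterministic generator with a known seed.
        // Anyone who knows (or forces) the seed can reproduce this key.
        weak := make([]byte, 32)
        r := mrand.New(mrand.NewSource(42)) // illustrative constant seed
        r.Read(weak)

        fmt.Printf("strong: %x\nweak:   %x\n", strong, weak)
    }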
So, how can one ascertain the security of their system? Statistical analysis is key. Consider the number of developers capable of managing codebases comprising tens of millions of lines. Statistics indicate that such large, complex systems will inevitably contain vulnerabilities. While AI can assist in identifying these vulnerabilities, it can also aid attackers and potentially present its own issues.
As a professional from the embedded world, I maintain that any secure and reliable feature should comprise no more than approximately eight hundred lines of code. This is a manageable amount for auditors.
Does this imply that traditional Unix administrators were misguided? Absolutely not. They managed systems connected to the internet and endeavored to rectify any emerging issues. Their approach was data-driven, which resulted in overfitting to the attackers of their time. As their systems became more complex, their defenses eventually covered the typical attacker types. Unix administrators gained knowledge, increased their salaries and tenure, and earned a strong reputation and trust.
However, anyone who runs training sessions soon sees that most methods have inherent flaws. Hashing passwords assumes that attackers have already infiltrated the system and collected the hashes. Could they not also alter the random number generators or the authentication logic? Defense in depth is a suitable answer for management, but it carries its own risks.
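For reference, here is a minimal sketch of the hashing step itself, using the widely used golang.org/x/crypto/bcrypt package. Note what it assumes: the salted, slow hash only helps once the credential database has already leaked, which is exactly the scenario questioned above.

    package main

    import (
        "fmt"

        "golang.org/x/crypto/bcrypt"
    )

    func main() {
        password := []byte("correct horse battery staple") // example secret

        // Store only the salted, slow hash, never the password itself.
        hash, err := bcrypt.GenerateFromPassword(password, bcrypt.DefaultCost)
        if err != nil {
            panic(err)
        }

        // At login, compare the stored hash against the submitted password.
        if bcrypt.CompareHashAndPassword(hash, password) == nil {
            fmt.Println("password accepted")
        }
    }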
Passwords assume that the attacker lacks access to a camera in any location where a laptop is used. However, even the most affordable phones are equipped with cameras today.
Our statistical reasoning has led to the following rule: conduct security audits and gather data on your systems. This can reveal attack types. Attackers depend on secret backdoors, and any audit increases the risk associated with their use. If a backdoor is fixed, they must incur the fixed cost of finding or creating a new one.
The more expensive a vulnerability is, the fewer will be available, according to the basic economic principle of supply and demand. Eventually, there will be so few that law enforcement can manage these groups with the limited taxpayer dollars at their disposal.
Consequently, large databases of audit data can be beneficial. While it may not be possible to identify all backdoors in a codebase comprising millions of lines, it is feasible to make any attack significantly more costly. The second rule for software vendors is to avoid handling security independently. Instead, delegate it to an external shim, wrapper, or antivirus. This allows each administrator to cover all security checkpoints to the same standard and fix any vulnerabilities with a single update.
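One possible shape for such an external shim, assuming an HTTP service, is a small reverse proxy that performs the checks in front of an unmodified application. The token check, ports, and backend address below are placeholders for whatever policy the administrator actually enforces.

    package main

    import (
        "log"
        "net/http"
        "net/http/httputil"
        "net/url"
    )

    func main() {
        // The unmodified vendor application listens locally on port 9000.
        backend, err := url.Parse("http://127.0.0.1:9000")
        if err != nil {
            log.Fatal(err)
        }
        proxy := httputil.NewSingleHostReverseProxy(backend)

        // The shim performs the security checks; updating this one binary
        // updates the checkpoint for every application behind it.
        handler := func(w http.ResponseWriter, r *http.Request) {
            if r.Header.Get("X-Api-Token") != "replace-with-real-check" { // placeholder check
                http.Error(w, "forbidden", http.StatusForbidden)
                return
            }
            proxy.ServeHTTP(w, r)
        }

        log.Fatal(http.ListenAndServe(":8080", http.HandlerFunc(handler)))
    }

Because the checks live in one small binary, the administrator can audit it within the eight-hundred-line budget mentioned above and roll out a fix everywhere with a single update.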