That Small Step
I am reflecting on the recent Nobel Prizes awarded to researchers David Baker, Demis Hassabis, and John Jumper, in part for work in artificial intelligence. It is an achievement that would have been difficult to predict in advance.
Copyright © Schmied Enterprises LLC, 2024.
Link of the day.
Professional profile images. Here.
Link of the day.
Write your AI here. Here.
Link of the day.
Or (Not XOR) Write your AI here. Here.
Link of the day.
Child Development. Here.
Link of the day.
Evacuation Management. Here.
Link of the day.
Data Protection. Here.
Link of the day.
Find a song. Here.
Link of the day.
Share a Sandbox for Training. Here.
Link of the day.
Vector Databases for AI. Here.
Link of the day.
AI UI with Langflow. Here.
Link of the day.
Langflow apps. Here.
Link of the day.
AI tutorial. Here.
Regulatory.
Intellectual Property. Here.
I majored in enterprise resource planning systems. Back in 2002, I opted against specializing in artificial intelligence, believing the machines were too underdeveloped at the time. After graduation, I joined a GPU manufacturer. We built multi-GPU systems for infrastructure control rooms, such as those of oil and gas pipelines and power plants.
We delivered some notable projects, including equipping many Microsoft meeting rooms with our video equipment. I built the device drivers for video streaming and graphics processing from scratch. That invaluable experience led me to Microsoft's Windows operating system department, where I worked on five different teams over eight years, carrying core Windows technology across Xbox, HoloLens, and Azure.
Eventually, I joined Cloudera, a company backed by Intel and public investors to the tune of roughly a billion dollars. I was initially skeptical. The company did not perform well, and my incentives dropped along with the share price. At the time, we were developing Hadoop, the clustering software used by numerous AI applications. By 2018, it became clear to me that something was amiss. We could collect data about research, individuals, and computer systems, but sustaining the research spending was prohibitively expensive. Once a key learning is proven, the clusters can be dismantled.
Learning about individuals always runs into the same privacy issues. You collect data that individuals already know and try to solve their problems without asking them directly. Privacy also brings numerous regulations. Gross domestic product per person has not increased significantly in most countries, so where does the return on investment come from? Nobody likes to be controlled. Eventually, I realized that tracking infrastructure, logs, and computer systems would be the key to revenue growth.
Around that time, I had a brief interview with a company developing neural network-based codecs using generative logic. I did not accept the offer for some reason and instead did a round at Amazon before returning to Cloudera. Still, the generative logic impressed me by regenerating a very similar image on the decoder side. I did some research and even wrote an article about generative AI, predicting that it would likely improve the efficiency of expensive software development teams. Honestly, we were building software back then the way guilds did in the Middle Ages.
Some of my arguments stem from my time at Microsoft. Only simple things can be sold en masse. I realized that English would be the future language of coding. I started a GitHub project called Englang (Engineering Language) but never continued it. I believe it was not an accident but a mindset that led Microsoft to advertise the technology early.
Four years have passed since then, and much has changed. I twice declined funding for my startup because of the terms, but I can now pay for and use two $20-a-month artificial intelligence models that are as powerful as two $200k-a-year software developers. I have saved almost $2 million in software development costs since founding the business. Impressive. The opportunity cost of my investment in Cloudera eventually paid off.
What is next? Nobody knows. Technological advances usually take time to spread globally. Wars hinder progress and reshape society. I believe the spread of artificial intelligence will provide us with income for a decade or more. I am eagerly looking forward to the standardization of AI chips. The great thing about the IBM PC was that the Intel x86 instruction set was standardized and implemented by AMD as well. This lowered costs, enabling PCs to be sold to every household. Beyond that, Apple built on IBM technology to create its own stack, keeping the software space competitive.
GPUs and TPUs have been useful for training artificial intelligence models, and standardization can help achieve more. Last year, we demonstrated that it is possible to train a model to implement a regular microcontroller core over a set of registers. Graphics cores could then replicate what Intel processors used to do, with dozens to hundreds of them on a chip instead of four or eight. It is akin to training a model to follow the logic of an x86 core. LLMs apply the same logic, predicting the next word from the context and the last sentences; CPUs do likewise, with a data context and an instruction stream.
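To make the analogy concrete, here is a minimal, hypothetical Python sketch, not the model we trained: a toy CPU steps through an instruction stream while updating a register context, and a toy language model steps through a token stream while extending a text context. The instruction names and the stand-in model are assumptions for illustration only.

```python
# Toy sketch of the analogy: both loops take a context and a stream,
# and produce the next state one step at a time.

def cpu_step(registers, instruction):
    """Apply one instruction to the register context (toy ISA, assumed)."""
    op, dst, src = instruction
    if op == "mov":
        registers[dst] = src                 # load an immediate value
    elif op == "add":
        registers[dst] += registers[src]     # combine two registers
    return registers

def llm_step(context, model):
    """Append the model's next-token prediction to the text context."""
    return context + [model(context)]

# CPU view: context = registers, stream = instructions.
registers = {"r0": 0, "r1": 0}
program = [("mov", "r0", 2), ("mov", "r1", 3), ("add", "r0", "r1")]
for instruction in program:
    registers = cpu_step(registers, instruction)
print(registers)   # {'r0': 5, 'r1': 3}

# LLM view: context = previous tokens, stream = generated tokens.
toy_model = lambda ctx: "next"   # placeholder for a real language model
tokens = ["the", "quick"]
for _ in range(3):
    tokens = llm_step(tokens, toy_model)
print(tokens)      # ['the', 'quick', 'next', 'next', 'next']
```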
Energy is a major issue, so I am planning my company's business around energy sources, probably solar, and rented data center hardware, possibly GPUs in containers. Then I can focus on what I enjoy: friendly displays, handheld systems, and user interfaces.