SEEKING WORK | Remote | Onsite San Francisco Bay Area
Versatile back-end code and infrastructure development services. Typical deliverables are Dockerized microservices specified with OpenAPI, written in Go or Python, with a REST/JSON API and backed by PostgreSQL or MySQL. CI/CD pipelines are included, with deployment on any of the major cloud vendors. And of course your custom business logic.
If you have more complex needs, e.g. project or team management, please reach out, as I have extensive career experience in all aspects of technology development and management in startups and large enterprises. Need a contract manager or CTO perhaps?
At minimum I can help you develop your back-end infrastructure from the ground up. Basically, I offer development of back-end components you can slot into your Kubernetes or Docker environment from day one.
On the business side, you get a fractional developer for a no-haggling fixed monthly rate, corp-to-corp billing, long term support for your code, careful vetting of dependencies for licensing and security, and a professional approach to your technical needs.
Location: SF Bay Area
Remote: Yes - or hybrid/in-office, I'm flexible
Willing to relocate: No
Technologies: Go, C/C++, Python, Docker, Kubernetes, PostgreSQL/MySQL, ML/LLM, and many others
Résumé/CV: https://linkedin.com/in/abnulladmin
Email: contact@nulladmin.com
I'm an engineer with many years of experience in Silicon Valley, working in management roles, engineering, and consulting. My expertise is mostly in back-end systems, working in the modern Go/Docker/Kubernetes ecosystem. In addition to working directly on large and complex code projects, I have also managed and grown engineering teams at various companies.
If you need help with specific technical projects, or are looking to fill management roles, please feel free to reach out. I'm very familiar with both the unique challenges of scaling up startups and the work of innovating inside large enterprise companies.
Specifically, if you need help productizing and scaling out AI/machine learning/LLM systems, this would be a good fit. I know there is quite a bit of demand in this area and a limited supply of qualified candidates.
The worst part is that they marked the incident as resolved after 32 minutes, but didn't mention that mirrors for the packages on security.ubuntu.com have huge queues. OK fine, we can wait until the mirrors sync and choose another mirror to do the update - eventually. You can also work around this while updating Ubuntu 24.04 by manually installing the deb file it wants.
But wait, there's more! You can't install new instances of Ubuntu 24.04.2 because the installer connects to security.ubuntu.com by default (probably for good reasons) and will bail out while formatting and writing the disk when it gets a 500 Internal Server Error from security.ubuntu.com for a specific deb file. There's no option around it that I'm aware of if you're doing an install connected to the network. I'm told that things should work if you install without networking connected, but that's not working for me, possibly because some drivers my hardware needs aren't in the default installer and have to be pulled over the network.
Ran into this while trying to install a fresh instance on old Mac hardware.
All in all, not a good look for Canonical, especially given how long this is taking to resolve and the lack of any status indication that this is still a problem. Lots of people have been bitten by this in the last 24 hours.
The craziest thing I've discovered is that unattended-upgrades does not time out after failing to download packages from security.ubuntu.com AND will NEVER release "dpkg/lock-frontend". It will happily keep failing to download new packages, NEVER printing any error messages that I could see to the journal or a log file ("/var/log/unattended-upgrades"), and preventing the user from using apt because it holds a lock that it refuses to give up.
The process doesn't even respond to "systemctl stop unattended-upgrades" or SIGTERM. Only "kill -9" breaks the iron grip it has on my systems.
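For anyone else stuck behind that lock, here's a minimal sketch of the escalation, assuming the lock lives at /var/lib/dpkg/lock-frontend and that fuser (from psmisc) is available; adjust to taste and run as root:

    # Minimal sketch, run as root. Assumptions: the dpkg frontend lock path below
    # and the psmisc "fuser" tool being installed.
    import os
    import signal
    import subprocess
    import time

    LOCK = "/var/lib/dpkg/lock-frontend"

    # fuser prints the PIDs of processes holding the file open on stdout
    result = subprocess.run(["fuser", LOCK], capture_output=True, text=True)
    pids = [int(p) for p in result.stdout.split()]

    for pid in pids:
        os.kill(pid, signal.SIGTERM)      # be polite first
    time.sleep(10)

    for pid in pids:
        try:
            os.kill(pid, 0)               # signal 0 just checks if it's still alive
        except ProcessLookupError:
            continue
        os.kill(pid, signal.SIGKILL)      # the "kill -9" that actually works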
Edit:
Out of curiosity I ran a packet capture; during the 8 minutes it was running, unattended-upgrades (apt) received 4 MB and sent 182 KB of packets. Given that the unattended-upgrades package is installed by default on Ubuntu and the "apt-daily-upgrade" timer runs every 24 hours ((archive|security).ubuntu.com has been down for longer), I can only imagine that there must be millions of machines reaching out repeatedly and uselessly, attempting to download new packages without any timeout.
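Purely back-of-envelope, and everything beyond my own 8-minute capture is an assumption, but the aggregate waste is easy to ballpark:

    # Back-of-envelope only; the install-base number is a pure assumption, and the
    # per-machine figure extrapolates my single 8-minute capture to one run per day.
    mb_per_machine_per_day = 4            # received during the capture above
    machines = 10_000_000                 # assumed machines still hitting security.ubuntu.com
    total_tb_per_day = mb_per_machine_per_day * machines / 1_000_000
    print(f"~{total_tb_per_day:.0f} TB/day of retries going nowhere")   # ~40 TB/day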
Game design is the ultimate lockbox - you're unlocking an entire imaginary world which has some platonic existence in your mind.
And since you mentioned Luanti, it deserves to be much better known as a credible open alternative to Minecraft. You could do a lot worse than designing/prototyping your game with Luanti as the game engine.
The styles of Cyberpunk 2077 and Red Dead Redemption 2 are also dead giveaways about their training data. There might also be a whiff of the Witcher 4 demo in one sequence.
The interesting possibility is that all you may need for the setting of a future AAA game is a small bit of the environment to nail down the art direction. Then you can dispense with the army of workers placing 3D models on the map in just the right arrangement to create a level. The AI model can extrapolate it all for you.
Clearly the days of fiddly level creation with a million inscrutable options and checkboxes in editors like Unreal, Unity, or Godot are numbered. You just say what you want and how you want to tweak it, and all those checkboxes and menus become disposable. As a bonus, that tears down a huge barrier to entry for amateur game makers.
> [LLMs] spit out the most likely text to follow some other text based on probability.
Modern coding AI models are not just probability crunching transformers. They haven't been just that for some time. In current coding models the transformer bit is just one part of what is really an expert system. The complete package includes things like highly curated training data, specialized tokenizers, pre- and post-training regimens, guardrails, optimized system prompts, etc., all tuned to coding. Put it all together and you get one-shot performance on generating the type of code that was unthinkable even a year ago.
The point is that the entire expert system is getting better at a rapid pace and the probability bit is just one part of it. The complexity frontier for code generation keeps moving and there's still a lot of low hanging fruit to be had in pushing it forward.
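To make the shape of that argument concrete, here's a toy sketch - every name in it is hypothetical, not any vendor's actual API - where the transformer call is a single line and everything around it is the rest of the "expert system":

    # Toy sketch only; all names are hypothetical. The point is the shape:
    # the probabilistic model is one step inside a larger, coding-tuned pipeline.
    def generate_code(task, model, system_prompt, guardrails):
        prompt = f"{system_prompt}\n\nTask:\n{task}"   # curated, coding-specific prompt
        candidate = model(prompt)                      # the "probability crunching" step
        for check in guardrails:                       # linters, policy filters, test runs
            candidate = check(candidate)
        return candidate

    # Stand-in components so the sketch runs end to end:
    fake_model = lambda prompt: "def add(a, b):\n    return a + b\n"
    tidy = lambda code: code.rstrip() + "\n"
    print(generate_code("add two numbers", fake_model, "You write Python.", [tidy]))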
> They're great for writing boilerplate that has been written a million times with different variations
That's >90% of all code in the wild. Probably more. We have three quarters of a century of code in our history, so there is very little that's original anymore. Maybe it's original to the human coder fresh out of school, but the models have all this history to draw upon. So if the models produce the boilerplate reliably, then human toil in writing if/then statements is at an end. Kind of like how - barring the occasional mad genius [0] - the vast majority of coders don't write assembly to create a website anymore.
> Modern coding AI models are not just probability crunching transformers. (...) The complete package includes things like highly curated training data, specialized tokenizers, pre- and post-training regimens, guardrails, optimized system prompts, etc., all tuned to coding.
It seems you were not aware that you ended up describing probabilistic coding transformers. Each and every one of those details is nothing more than a strategy for applying constraints to the probability distributions used by the probability crunching transformers. I mean, read what you wrote: what do you think "curated training data" means?
> Put it all together and you get one-shot performance on generating the type of code that was unthinkable even a year ago.
> The complete package includes things like highly curated training data, specialized tokenizers, pre- and post-training regimens, guardrails, optimized system prompts, etc., all tuned to coding.
And even with all that, they still produce garbage way too often. If we continue the "car" analogy, the car would sometimes crash as soon as you leave the driveway, and sometimes it would just drive into the house. So you add all kinds of fancy bumpers to the car and guardrails to the roads, and the car still runs off the road way too often.
What we should do and what we are forced to do are very different things. If I can get a machine to do the stuff I hate dealing with, I'll take it every time.
After a while, it just makes sense to redesign the boilerplate and build some abstraction instead. Duplicated logic and data are hard to change and fix. The frustration is a clear signal to take a step back and take a holistic view of the system.
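As a trivial made-up illustration, the copy-pasted if/then checks collapse into one helper, so the next change lands in exactly one place:

    # Made-up example: the same required-field check, duplicated across handlers,
    # becomes one helper; the rule now changes in a single place.
    def require(value, name):
        if value is None or value == "":
            raise ValueError(f"{name} is required")
        return value

    def create_user(payload):
        return {
            "email": require(payload.get("email"), "email"),
            "name": require(payload.get("name"), "name"),
        }

    print(create_user({"email": "a@example.com", "name": "Ada"}))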
And this is a great example of something I rarely see LLMs doing. I think we're approaching a point where we will use LLMs to manage code the way we use React to manage the DOM. You need an update to a feature? The LLM will just recode it wholesale. All of the problems we have in software development will dissolve in mountains of disposable code. I could see enterprise systems being replaced hourly for security reasons: there's less chance of a vulnerability being abused if it only exists for an hour in which to find and exploit it. Since the popularity of LLMs proves that as a society we've stopped caring about quality, I have a hard time seeing any other future.
> In current coding models the transformer bit is just one part of what is really an expert system. The complete package includes things like highly curated training data, specialized tokenizers, pre- and post-training regimens, guardrails, optimized system prompts, etc., all tuned to coding. Put it all together and you get one-shot performance on generating the type of code that was unthinkable even a year ago.
This is lipstick on a pig. All those methods are impressive, but ultimately workarounds for an idea that is fundamentally unsuitable for programming.
> That's >90% of all code in the wild. Probably more.
Maybe, but not 90% of time spent on programming. Boilerplate is easy. It's the 20%/80% rule in action.
I don't deny these tools can be useful and save time - but they can't be left to their own devices. They need to be tightly controlled and given narrow scopes, with heavy oversight by an SME who knows what the code is supposed to be doing. "Design module W with interface X to do Y in Z way", keeping it as small as possible and reviewing it to hell and back. And keep it accountable by writing the tests yourself. Never let it test itself; it simply cannot be trusted to do so.
LLMs are incredibly good at writing something that looks reasonable, but is complete nonsense. That's horrible from a code maintenance perspective.
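To make the "write the tests yourself" point concrete, a made-up example (nothing from a real project): the human pins down the contract for a narrowly scoped module before accepting whatever the model produced.

    import re
    import unittest

    # Hypothetical LLM-generated helper under review; deliberately tiny in scope.
    def slugify(text):
        text = text.lower()
        text = re.sub(r"[^a-z0-9]+", "-", text)
        return text.strip("-")

    # Human-written tests that pin down the contract independently of the generator.
    class TestSlugify(unittest.TestCase):
        def test_lowercases_and_hyphenates(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_strips_punctuation(self):
            self.assertEqual(slugify("C++ / Rust!"), "c-rust")

    if __name__ == "__main__":
        unittest.main()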