
"There’s a reason we’re not seeing a “Startup Boom” AI skeptics ask, “If AI is so good, why don’t we see a lot of new startups?” Ask any founder. Coding isn’t even close to the most challenging part of creating a startup."

-- uhhh... am I the only one seeing a startup boom??? There are a bajillion kids working on AI startups these days.


Startups optimize for things that secure initial seed money, not things that stand up on their own two feet. That is why most fail. What VCs invest in and what succeeds are not really correlated.


Nope, you're spot on.

AI deals continued to dominate venture funding during the third quarter. AI companies raised $19 billion in Q3, according to Crunchbase data. That figure represents 28% of all venture funding.

The fourth quarter of 2024 has been no less busy for these outsized rounds. Elon Musk’s xAI raised a behemoth $6 billion round in November, one of seven AI funding rounds over $1 billion in 2024. That’s just months after OpenAI raised its $6.6 billion round.

https://techcrunch.com/2024/12/20/heres-the-full-list-of-49-...


Yeah - it's basically like the days when everyone added "Deep" to their name to get an extra 0 on their valuation.

What do you do? Oh, we do DeepCoffee brewing. It's a coffee machine powered by Deep Learning to brew the perfect cup. Keurig and Starbucks are Yahoo, and we're Google. (Now people probably say those guys are Google and we're OpenAI, but I digress.)


It's a real research paper, but a bit of a hokey one.

The spam blog is just promoting it. https://pubmed.ncbi.nlm.nih.gov/40516525/


The internet says 100W idle, so maybe more like $40-50 in electricity; depending on where you live, it could be cheaper or more expensive.

Makes me wonder if I should unplug more stuff when on vacation.


I was surprised to find out that my apartment pulls 80-100W when everything is seemingly down during the night. A tiny light here and there, several displays in sleep mode, a desktop idling (a mere 15W, but still), a laptop charging, several phones charging, etc., and the fridge switches on for a short moment. The many small amounts add up to something considerable.


I got out of the homelab game as I finished my transition from DevOps to Engineering Lead; it was simply massively overbuilt for what I actually needed. I replaced an ancient Dell R700-series, an R500-series, and a couple of Supermicros with 3 old desktop PCs in rack enclosures and cut my electric bill by nearly $90/month.

Fuckin nutty how much juice those things tear through.


Yeah, it kinda puts it all into perspective when you think of how every home used to use 60-watt light bulbs throughout. Most people just leave lights on all over their home all day, probably using hundreds of watts of electricity. Makes me realize my 35-65W laptop is pretty damn efficient haha


100W over a month (rule of thumb 730 hours) is 73kWh. Which is $7.30 at my $0.10/kWh rate, or less than $25 at (what Google told me is) Cali’s average $0.30/kWh.
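
If you want to plug in your own numbers, the arithmetic is a one-liner; the rates below are just the ones floating around this thread:

  # Monthly cost of a continuous load; rates ($/kWh) are illustrative.
  watts = 100
  hours_per_month = 730                  # rule-of-thumb hours in a month
  kwh = watts / 1000 * hours_per_month   # 73 kWh
  for rate in (0.10, 0.30, 0.60):
      print(f"${rate:.2f}/kWh -> ${kwh * rate:.2f}/month")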


Your googling gave results that were likely accurate for California 4-5 years ago. My average cost per kWh is about 60 cents.

Rates have gone up enormously because the cost of wildfires is falling on ratepayers, not the utility owners.

Regulated monopolies are pretty great, aren’t they? Heads I win, tails you lose.


60 cents per kWh? That’s shocking. Here in Oregon people complain about energy prices, and my fully loaded cost (not just the per-kWh rate, but including everything) is 19¢. And that's even though I go over the limit for single-family residential, where I end up in a higher-priced bracket. Thanks for making me feel better about my electricity rate. I’m sorry you have to deal with that. The utility companies should have to pay to cover those costs.


Depends entirely on the utilities board doing the regulation.

That said, I'm of the opinion that power/water/internet should all be state/county/city run. I don't want my utility companies to have profit motives.

My water company just got bought up by a huge water company conglomerate and, you guessed it, immediate rate increases.


Most utilities, even if ostensibly privately owned, are profit-limited, and rates must be approved by a regulatory board. Some are organized as non-profits (rural water and electric co-ops, etc.). This is in exchange for the local monopoly.

If your local regulators approved the merger and higher rates, your complaint is with them as much as the utility company.

That's not to say some regulators aren't basically rubber stamps, or even corrupt.


I agree. The issue really is that they are 3 layers removed from where I can make a change. They are all appointed, not elected, which means I (and my neighbors) don't have any recourse beyond the general election. IIRC, they are appointed by the governor, which makes it even harder to fix (might be the county commissioner; I'm not 100% on how they got their position, just that it was an appointment).

I did (as did others), in fact, write in comments and complaints about the rate increases and buyout. That went unheard.


CORE energy in Colorado is charging $0.10819 per kWh _today_

https://core.coop/my-cooperative/rates-and-regulations/rate-...


They have definitely increased but not all of California is like this. In the heart of Silicon Valley, Santa Clara, it's about $0.15/kWh. Having Data Centers nearby helps, I suppose.


I'm guessing the parent is talking about the total bill (transmission, demand charges, etc.). $0.15/kWh is probably just the usage, and I am very skeptical that's accurate for residential.


Correct. $0.15/kWh is usage. There are a few small fees, but that’s likely the case in most places. This is residential use. If skeptical, a quick online search is all it takes…


Santa Clara's energy rates are an outlier among neighboring municipalities, and should not be used as an example of energy cost in the Bay Area. Santa Clara residents are served by city-owned Silicon Valley Power, which has lower rates than PG&E or SVCE, which service almost all of the South Bay.


Well, the discussion was California as a whole and averages, so I decided to share. As with averages, data falls above and below the mean, so when a commenter above said $0.30/kWh was much too low for California, I decided to lend some support to the argument, as I’m in California paying below the average. It’s just a data point, a counterexample to the claim made by the parent. Maybe it helps fellow nerds pick a spot in the Bay if they want to run their homelabs.


100W continuous at 12¢/kWh (US average) is only ~$9 / month. Is your electricity 5x more expensive than the US average?


The US average hasn't been that low in a few years; according to [0] it's 17.47¢/kWh, and significantly higher in some parts of the country (40+ in Hawaii). And the US has low energy costs relative to most of the rest of the world, so a 3-5x multiplier over that for other countries isn't unreasonable. Plus, energy prices are currently rising and will likely continue to do so over the next few years.

$50/month for 100W continuous usage isn't totally mad, and that could climb even higher over the rest of the decade.


Not OP, but my California TOU rates are between 40 and 70 cents per kWh.

Still only $50/month, not $150, but I very much care about 100W loads doing no work.


Those kWh prices are insane; that’ll make industry move out of there.


Industrial pays different rates than homes.

That said, I am not sure those numbers are true. I am in California (PG&E with East Bay community generation), and my TOU rates are much lower than those.


There are 3 different components of PG&E electricity bills, which makes the bill difficult to read. I am also in PG&E East Bay community generation, and when I look at all components, it’s:

Minimum Delivery Charge (what’s paid monthly, which is largely irrelevant, before annual true-up of NEM charges): $11.69/month

Actual charges, billed annually, per kWh:

  Peak NEM charge: $.62277
  Off-Peak NEM charges: $.31026
Plus 3-20% extra (depending on the month) in “non-bypassable charges” (I haven’t figured out where these numbers come from), then a 7.5% local utility tax.

Those rates do get a little lower in the winter (.30 to .48), and of course the very high rates benefit me when I generate more energy than I consume (which only happens when I’m on vacation). But the marginal all-in costs are just very high.

That’s NEM2 + TOU-EV2A, specifically.
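
Rough math on the all-in marginal rate from those components (treating the non-bypassable charges as 10%, an assumed mid-range value):

  # All-in marginal $/kWh from the components quoted above.
  peak, off_peak = 0.62277, 0.31026   # NEM charges, $/kWh
  non_bypassable = 0.10               # assumed; actual is 3-20% by month
  utility_tax = 0.075
  for label, base in (("peak", peak), ("off-peak", off_peak)):
      all_in = base * (1 + non_bypassable) * (1 + utility_tax)
      print(f"{label}: ${all_in:.3f}/kWh")

That lands around $0.74/kWh peak and $0.37/kWh off-peak, all-in.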


Are you actually able to compute that? With PG&E + MCE, because of the way they back off the PG&E generation charges, the actual per-time-period rates are not disclosed.

I can solve for them with three equations for three unknowns... but since they change the rates quarterly, by the time I know what my exact rates were, they have already changed.
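
For the curious, that "three equations, three unknowns" trick is just a small linear solve; all the usage and bill numbers below are invented for illustration:

  import numpy as np

  # kWh used in peak / part-peak / off-peak across three billing
  # periods, plus each period's bill total. All numbers are made up.
  usage = np.array([
      [120.0,  80.0, 300.0],
      [ 90.0, 110.0, 280.0],
      [150.0,  60.0, 320.0],
  ])
  totals = np.array([188.0, 177.5, 202.5])

  # Solve usage @ rates == totals for the three unknown $/kWh rates.
  rates = np.linalg.solve(usage, totals)
  print(rates)   # -> [0.55 0.4  0.3 ]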


If he’s only paying $50, most of it is connection fees, and low usage is distorting his per-kWh price way up.


> Makes me wonder if I should unplug more stuff when on vacation.

What's the margin on unplugging vs just powering off?


That also depends on the country you live in.

The EU (and maybe China?) have been regulating standby power consumption, so most of my appliances either have a physical off switch (usually as the only switch) or should have very low standby power draw.

I don't have the equipment to measure this myself.


By "off" you mean, functionally disabled but with whatever auto-update system in the background with all the radios on for "smart home" reasons - or, "off"?


Depends on the server. This test got 79W idle for a _two-socket_ E5-2690 v4 server.

https://www.servethehome.com/lenovo-system-x3650-m5-workhors...


It looks more like C++ or C now. I never really became a Java programmer. Hopefully I never will have to.


Back in 2005 or so, Java felt like "C++, but we know better" to me. I had no choice but to use it, but boy did they omit some stuff that was actually useful. The boilerplate back then (driven by both the limitations of the language and a surge in architecture astronauts) was absolutely insane. Don't even get me started on anything labelled "Java Enterprise" in the 2000s, bloat, slowness and bugs everywhere. IBM, Oracle and friends were running quite the circus. Creating working, performant software in that environment was a brutal challenge. I think Java, the language, is only partly to blame, but it was limited enough to spawn pretty complex code generators and frameworks.

Took the industry a decade or two to arrive at typed languages that were an actually good subset of C++ features. Java, in my opinion, wasn't one.


Did you DIY? Our installer basically only put panels on the sunniest sides of our home. I'm surprised you went with panels on all sides.


We didn't DIY. Our installer was happy to put the panels wherever we wanted.

Our roof is an even East/West split. So one side powers our morning and the other side our afternoon.


It's really interesting that you still get a smooth production graph with east/west, I wasn't sure what that would look like.

We have east and south and our peak production is when the sun is at the south-east, in the middle of the two faces. The east face production drops off from there on out until it's a fraction of what the south face is generating.


You might want to start playing factorio ;-)

The answer is yes: it is a lot easier to make a PV farm than a home scale install!


I don't get the joke. I mean, I understand factorio. But in factorio, "home scale" is extra easy. You can craft a handful of solar panels in your back pocket, no need to even think about setting up a production line.


On the long road to 1M SPM already :)


To be fair, Apple Watch battery life is atrocious compared to competing models. Their marketing and ecosystem are better.


IMO there's a gap between "charge every day" and "charge once a week" that needs to be crossed.

In other words, if they made the battery last twice as long it'd still be just as annoying (since your daily routine would be nearly the same, except now you'd also need to remember whether it's a charge day or a non-charge day).

To be fair, maybe 3-4 days buys you some convenience. But anyway, charging once a day is a reasonable place to get to; getting something better would require at minimum a 3x improvement, which probably means a ground-up rework instead of continuous refinement.

A battery band might get you there but I suspect it'd be too clunky. At best Apple may redesign their watch to support a battery band and allow 3rd parties to make them for folks that need weeks of battery life.


For me, it comes down to two things. First, I do not want to have to charge every night, since I use my watch as a silent vibrating alarm and I track my sleep. It seems like Apple has basically overcome this hurdle, now that you can charge while you shower and get by.

The other issue is that I don't want to have to bring Yet Another Dongle™ every time I go away for a weekend or short business trip. Most of my trips are ≤ 4 days, so if AWs could reliably go that long (including battery degradation over time) then I'd consider getting one.

Right now, only the AWU even approaches this, and only in low-power mode. If it weren't a thousand dollars, I'd consider it. But between the low-power requirement and the pricing, it's just no contest in my book. I'm getting a new Pebble, which offers a month of battery life at 1/3 of the cost.


> The other issue is that I don't want to have to bring Yet Another Dongle™

I think reverse charging from your smartphone, which is supported by certain Android devices, is a quite decent solution to the problem.


If this were possible, it would definitely make a difference for me.


I am surprised Apple doesn't sell a battery band for people who want a week's charge.


That would be slick. Perhaps the problem is it would get hot, and possibly burn?


I watched the announcement yesterday and was very surprised to hear the watch battery life is still so shocking.

Especially considering how useful sleep data is, then I was surprised to see they're only getting sleep scores now.

My dirt cheap Huawei watches have had this for years. It's accurate enough (my own perception based on use). And I get a week's battery life too (although I don't have the distracting fancy notifications, perhaps). It does check blood oxygen levels, heart rate, stress, etc.

I truly thought this was a solved problem (looking at headphone battery life, although I might need to check whether my assumptions here also apply to AirPods).


The watch has sleep data (for example, phase durations like REM sleep and apnea), the health app just doesn't compute a "score".

> I truly thought this was a solved problem.

I charge when showering in the morning. 15 minutes is enough for the day + night, half an hour to charge it fully.


That's not bad


I switched from Apple Watch to a Garmin Venu. The battery lasts for a week, and many of the sensors are more accurate.


And that's the fancy-screen, gimmick-edition Garmin watch; the normal MIP-display Garmin watches (even an old, midrange Forerunner 255) will easily get a couple of weeks of battery life, more for the higher-end ones.

OLED is just the wrong screen tech for these devices; it never made any sense to me, given how little I care about graphics and how little time I spend reading the display.


But it's not the screen that drains the battery so fast; it's the general-purpose OS with a decent CPU.


The new one is 24 hours. Is that still atrocious?


Yes. My Pebble Steel got over a week of battery in 2015, had physical, tactile buttons that worked even wearing thick winter gloves, and had an always-on-no-matter-what screen that was clearly readable in full sunlight.

Every smartwatch that hasn't met that bar, which is almost all of them ever made, is a joke to me. I'd have ordered a RePebble had I not moved back to analogue dumbwatches just before they were announced (and were iOS not actively hostile to competing watch implementations).


And motorcycles get way better gas mileage than cars. But it’s still odd to frame a (totally understandable!) preference for one product category in those terms.


If you are okay with less-smart smartwatches, and okay with no hackability, Garmin should have a few with a black-and-white display and >1 week of battery life (even indefinite with sufficient solar).


That’s not really the same category of device


Isn’t that a laggy B&W screen, with no ability to respond to notifs and no cellular? I guess those are OK for some users.


depends which camp of apple watch (or smart watch in general) users you are asking.

the camp that sees the smartwatch as an accessory to their smartphone that does fitness tracking and maybe a few other useful things to avoid pulling their phone out constantly - those people want MUCH longer battery life.

the camp that sees the smartwatch as a REPLACEMENT for their smartphone - they are perfectly fine with the current battery life.


I am closer to the first camp than the second, and I don’t understand why I would need longer battery life. The watch charges very quickly, and there is never a day when I don’t have the chance to charge at some point. I usually do it during my morning shower.


1. People use these GPS watches for Ironman triathlons, ultra running & cycling events, etc. They can't and won't charge before the battery is done - and remember, a battery with a daily charge will degrade significantly. If it's borderline on release, it'll be inadequate after a year.

2. Just for general convenience, having to take another special cable for every late night or overnight trip is maddening. I always have a phone anyway for any actual interactions.

I find it hard to believe many people are writing texts on their watches; it's just a nice-to-have gimmick feature that everyone I know has stopped using.


> and remember the battery with a daily charge will degrade significantly. If it's borderline on release, it'll be inadequate after a year.

That has not been my experience, though - having used both an Apple Watch and a Pixel Watch for years on end, every single day. Absolutely outside my area of expertise, but I would imagine that you can design batteries to have a much longer lifetime (number of recharge cycles) when their capacity is smaller.


That’s not how Li-ion charging works: degradation and lifetime (to a first approximation) depend on equivalent full charge cycles. If you charge daily from 80% to 100% or charge every 5 days from 1% to 100%, your battery degradation and lifetime will be the same.
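
In equivalent-full-cycle terms, the first-order model here:

  # To a first approximation, degradation tracks equivalent full cycles.
  # Compare the two charging habits over 100 days:
  daily_topups = 100 * 0.20         # 20% top-up every day
  full_charges = (100 / 5) * 0.99   # ~full charge every 5 days
  print(daily_topups, full_charges) # ~20 equivalent cycles either way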


The new one isn't actually longer. It's just that they changed how they measure it. It assumes 16 awake hours and 8 asleep hours, so the watch lasts 24 hours, but only when you are sleeping and thus not using it for 8 hours.


Yep, easily the worst part of mine, especially since it has to charge at a different time than my phone to allow for sleep tracking.


My biggest complaint with my Apple Watch is that I have to choose between sleep tracking and being able to wear my watch all day.


Why? You can get 8 hours of sleep tracking for a 5 minute charge. You really can't charge your watch for 5 minutes before bed? How about during your bathroom routine?

You are brushing your teeth for like half that alone.


Yes, my 5-year-old Garmin still lasts about 10 days. And that's with using GPS tracking + Bluetooth audio for multiple recorded activities.


Yes. Simply yes for a lot of people.


Are those people who don't need interactivity, the ability to respond to notifs, cellular, etc., or are you comparing with something comparable?


I think a lot of people reach into their pocket and get their phone out if they need "interactivity, ability to respond to notifs, cellular, etc"

But if you want to leave your smartphone at home and still want cellular and notifications, I agree the Apple Watch is the only game in town, even if the battery life sucks.


Most of this is because of the always-on screen. If you can live without it and switch back to the motion or button to wake mode, you get 30-50% more usage before the battery runs out, which is not a huge improvement but is a legitimate option.

A side effect is that this makes your watch look less new, and therefore less of a theft target.


real watches last like 24 months minimum


And bicycles go much further without needing petrol than cars.

I agree that Apple Watches don't last long enough between charges, but comparing them to a completely different class of device that's technically the same broad category is pointless.


I think the noise has to be random... so it's inherently inconsistent ;) .. maybe?


It's easy to think about if you can see it in both the frequency and time domains.

So the Fourier transform of white noise is still... white noise. Random is random, as you say. But this has implications: it means the "wattage" of the noise (i.e., Voltage × Current == Watts, aka its power) is a somewhat predictable value. If you have 0.5 Watts of noise, it will be 0.5 Watts of noise in the frequency domain too (summed across all frequencies after the Fourier transform). That's Parseval's theorem.

The hard part of amplification is keeping it consistent across all specifications. I assume the previous post was talking about keeping white noise (which is "flat" across all frequencies) truly flat. I.e., it means your op-amps (or whatever other amplifier you use) CANNOT distort the value.

Which is still student level (you cannot be a good EE / analog engineer if you're carelessly introducing distortions). Any distortion of white noise is easily seen, because your noise profile weakens (or strengthens) over frequency rather than staying consistent.
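
A quick numpy sketch of that time/frequency equivalence (Parseval's theorem), if you want to see it for yourself:

  import numpy as np

  # White noise: total power is preserved by the Fourier transform,
  # and the spectrum is flat on average.
  rng = np.random.default_rng(0)
  x = rng.normal(0.0, 1.0, 2**16)

  X = np.fft.fft(x)
  time_power = np.sum(x**2)
  freq_power = np.sum(np.abs(X)**2) / len(x)  # Parseval normalization
  print(time_power, freq_power)               # equal, up to float error

  # Average power is the same in every frequency band (flat spectrum).
  psd = np.abs(X[: len(x) // 2]) ** 2
  bands = psd.reshape(16, -1).mean(axis=1)
  print(bands / bands.mean())                 # all roughly 1.0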


But most common noise sources are not white; you have to decolor them first.


Alternatively, you can choose a proven source of white noise.

Such as the reverse-bias shot and/or avalanche noise at the p-n junction of a reverse-biased Zener diode, which is white noise into the hundreds of MHz. Maybe not good enough for RDSEED, but certainly good enough and fast enough for most hobbyist projects experimenting with this for the first time.


First of all, I'm shocked people still use d3. Hasn't there been something better? It's pretty ancient by JavaScript standards. Do people still use jQuery too? Haha... it's been about a decade since I touched this stuff!

Second of all, isn't it ungodly slow? I get that it can draw a few boxes nicely, and maybe shuffle them around, but I had to write my own engine using HTML canvas because d3 couldn't get SVG to flow properly once I had thousands of pixels in my image.

Honestly, if you're going to go through the trouble of understanding d3, I would just write your own JavaScript canvas code to animate things.


Vue is about as ancient, yet people still use that. Python is even older.


Bingo!


I talked to GPT yesterday about a fairly simple problem I'm having with my fridge, and it gave me the most ridiculous / wrong answers. It knew the spec, but was convinced the components were different (a single compressor, for example, whereas mine has 2 separate systems) and was hypothesizing the problem as being something that doesn't exist on this model of refrigerator. It seems like in a lot of domain spaces it just takes the majority view, even if the majority is wrong.

It seems to be a very democratic thinker, but at the same time it doesn't seem to have any reasoning behind the choices it makes. It tries to claim it's using logic, but at the end of the day its hypotheses are just Occam's razor without considering the details of the problem.

A bit, how do you say, disappointing.


Clearly a skill issue where you're expecting it to know all of the specifications of a particular refrigerator model.

You didn't provide it with the correct context.

