Hacker News | mholm's comments

Adding a few diverse hardware environments for testing during the contest would mitigate this. Many companies wouldn't have any issue with infrastructure-specific optimizations either. (Part of) DeepSeek's big advantage over its Chinese competitors was its intelligent use of the hardware, after all.


Feels like HTX blew up out of nowhere with a ton of long-form content at once, but they were already huge on Chinese social media and finally decided to start translating their previous content into English and uploading it to YouTube.

My family and I binged a few of their videos. They’re so good

Oh nice, these are the guys that made the auto-aiming trash can!


I've sent money to creators on YouTube/Instagram, but my employer at the time had government contracts, so is it fair to say the US government funds Factorio video content?

Sure. You nailed it.

I’m not sure I trust Snap, of all companies, to make a good cross-platform framework after how terrible their Android app has been.


I think it’s been changed since, but wow was it weird finding out that instead of taking photos, the Android app used to essentially take a screenshot of the camera view.


I worked on the camera in Instagram iOS for a while. There, at least, there could be a 5,000 ms latency delta between the “screen preview” and the actual full-quality image asset from the camera DSP in the SoC.

I don’t know a thing about Android camera SDK but I can easily see how this choice was the right balance for performance and quality at the time on old hardware (I’m thinking 2013 or so).

Users didn’t want the full quality at all, they’d never zoom. Zero latency would be far more important for fueling the viral flywheel.


> Users didn’t want the full quality at all, they’d never zoom.

Dating apps use awful quality versions of the photos you upload too. Seems to be good enough for most people.


Must increase engagement, surely.


I worked on the Snapchat Android app back in 2017. It's only weird for people who have never had to work with cameras on Android :) Google's done their best to wrangle things with CameraX, but there's basically a bajillion phones out there with different performance and quality characteristics. And Snap is (rightfully) hyper-fixated on the ability to open the app and take a picture as quickly as possible. The trade-off they made was a reasonable one at the time.


Things have improved since then, but as I understand it, the technical reason is that only the camera viewfinder API used to be universal across devices. Every manufacturer implemented their cameras differently, so developers had to write per-model camera handling to take high-quality photos and video.
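In practice that per-model handling tended to look like a big quirks table keyed on the device model, merged onto sane defaults. A toy sketch of the idea (the model names and quirk values below are made up for illustration, not real device data):

```python
# Toy illustration of per-model camera special-casing on old Android.
# Model names and quirk values are hypothetical.
DEFAULT_QUIRKS = {"rotate_degrees": 0, "use_viewfinder_capture": False}

MODEL_QUIRKS = {
    "VendorA One": {"rotate_degrees": 90, "use_viewfinder_capture": True},
    "VendorB Max": {"rotate_degrees": 270},
}

def camera_quirks(model: str) -> dict:
    """Merge model-specific overrides onto the defaults."""
    return {**DEFAULT_QUIRKS, **MODEL_QUIRKS.get(model, {})}

print(camera_quirks("VendorA One")["rotate_degrees"])    # 90
print(camera_quirks("Unknown Phone")["rotate_degrees"])  # 0
```

Multiply that table by every shipping handset and it's easy to see why "just grab the viewfinder frame" was the path of least resistance.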


:) this is exactly how we used to do it even on iOS, back in the days before camera APIs were made public, when Steve Jobs personally allowed such apps to be published in the iOS App Store (end of 2009) ...


That was the only way to avoid the insane shutter lag that was very common on Android phones at the time. It's called Snapchat, not HoldStillForAMinuteChat, so it made sense.

Blame Google if you want to blame anyone. They could have mandated maximum shutter lag times (maybe they do now, I don't know).


I went through the whole blind research rabbit hole and ended up with Smartwings via Amazon. I had looked into a lot of other providers and nothing had a similar combination of reliability, cost, and customer service.


Customers are generally low-information shoppers. They go to a hardware store and ask the salesperson for a fridge that fits their requirements. The rep will show them a few options, and then the customer gets to try them out. This is where the animal brain takes over: Samsung designs for the animal brain. It's sleek. It's futuristic. There's so many doors. It has a beverage drawer. A condiment drawer. You can customize the panels. The animal knows the Samsung fridge is better, and customers likely won't know any better if the salesperson doesn't tell them (and would they? They make a better commission on the more expensive fridge).


I'm surprised we haven't seen more live AR/VR sports content. There's so much that could be enhanced via the Vision Pro if the app content were there, and Apple is clearly able to throw money around to make things happen.


The content creation engine is really only starting now, it's a pretty ripe "gold rush" opportunity for those that are willing to get their hands dirty. There wasn't even a formalised workflow until about a month ago.

Meanwhile, you're spot on about sports; here's a news post from about a week ago:

https://www.apple.com/newsroom/2025/10/spectrum-brings-nba-g...


Watching sports alone is not a hobby enjoyed by most people.


I would absolutely disagree. A lot of my friends and relatives will spend their evenings (especially during basketball/football playoffs) alone watching sports. Sometimes with their partners in the room, but they're often not paying attention. It's only really the major games (final four+, super bowl, etc) that they'll gather together to watch.


Different chip SKUs are often a TON of work. If you try to release all of them at the same time, your chip pipeline needs tons of work all at once, all in the same stages of the process. By staggering them, you spread that work out across the year.


There are a lot of annoying hurdles when allowing some types of application access. Needing to manually allow things in the security menu, allowing unrecognized developers, unsigned apps. Nothing insurmountable so far, but progressively more annoying for competent users to have control over their devices.


Apple really doesn't tell power-users about a lot of these features. You can really gain a lot by searching for Mac shortcuts and tricks. I still learn new things that have been around for over a decade.


Another tip: lots of useful characters are only an Option press away. You can find them by viewing your keyboard [1], which is easy if you have your input source on your dock. Some of my favorites:

     ⌥k = ˚ (degree)         ⌥e a = á
     ⌥p = π (pi)             ⌥e e = é
     ⌥5 = ∞ (infinity)       ⌥e i = í
     ⌥d = ∂ (partial)        ⌥e o = ó
     ⌥8 = • (bullet)         ⌥e u = ú
    ⇧⌥9 = · (middot)         ⌥n n = ñ
⇧ = shift; ⌥ = option

[1]: https://support.apple.com/guide/mac-help/use-the-keyboard-vi...


This is one of my favorite features of Macs, and it astounds me that there's nothing close to an equivalent on the other platforms. The recommendation is always 'install a keybinding app and add them all as key bindings', as if that wouldn't take hours of tedious labor.


> which is easy if you have your input source on your dock.

correction: on your macOS menu bar (at the top-right of the screen along with WiFi, time/date, etc)


I'd argue that if you need to be told about keyboard shortcuts, you're not a power user (i.e., I'd consider knowing how to find keyboard shortcuts a core trait of power users).


Keyboard shortcuts should be exposed in some fashion. IMO, Microsoft is typically better at this.


What specifically does Microsoft do that Apple should do?


My perception is that on Windows it is standard to display keyboard shortcuts next to application menu items, whereas on the Mac, that doesn’t seem to be the case. Perhaps that’s just a culture thing. It’s expected on Windows, and not as expected on Mac.


macOS does this too (if I'm following correctly); you can see it in the "The Apple Menu in macOS Ventura" screenshot on this Wikipedia page: https://en.wikipedia.org/wiki/Apple_menu#/media/File:Apple_m... It's done for both application keyboard shortcuts and system shortcuts (as in this example).

For completion, system shortcuts are also available in `System Settings > Keyboard > Keyboard Shortcuts...` (where they can also be changed). (Although I don't think that's 100% comprehensive, e.g., I don't think `⌘⇥` for the application switcher is listed there.)


It's talking about the specific content being watched, right? Could a media company release a silent episode, then if any ad with noise is played on it, file suit?


I think the standard they use for calculating normalized loudness (BS.1770) is technically undefined for silent content.
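Roughly why: BS.1770 loudness is a logarithm of the (K-weighted, gated) mean square of the signal, so digital silence puts a zero inside the log. A stripped-down sketch (real BS.1770 also applies K-weighting filters and block gating, omitted here):

```python
import math

def loudness_lufs(samples):
    # Simplified BS.1770-style loudness: -0.691 + 10*log10(mean square).
    # Real BS.1770 K-weights the signal and gates quiet blocks first.
    ms = sum(s * s for s in samples) / len(samples)
    if ms == 0.0:
        # log10(0) is undefined: silence has no finite loudness value
        raise ValueError("loudness undefined for digital silence")
    return -0.691 + 10 * math.log10(ms)

print(round(loudness_lufs([0.1, -0.1] * 100), 3))  # -20.691
```

So a deliberately silent program would produce no number to normalize an ad against, which is presumably why the rule is framed around the accompanying content.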

