A lot of Japanese learners do hate katakana (personally, a lot of fonts could stand to be clearer about ツシンソ), because most writing is in kanji+hiragana, so they get less practice with katakana. But kana ability is really just exposure. Use it to get used to it.
Same reason people say kana-only writing (like in old video games, for example) is hard to read: competent readers of any language don't spell things out in detail. Even when we subvocalize, we first recognize the shape of the scribbles, and our brain takes a shortcut from a certain set of scribbles to certain morphemes/words, which is where the solid feeling of meaning comes from.
Every competent reader of Japanese is first and foremost used to the kanji-hiragana mixed script, and has shortcuts for the kanji forms of words and the sounds of those words. The hiragana-only forms? Not so much. So when they complain about hiragana-only text being hard to read, they're not lying. It really is harder. But it's not harder due to any inherent defect in a hiragana-only script; it's just a lack of exposure to form those shortcuts that make reading feel easy.
- I made an appointment online for sports medicine. The doctor says I need an X-ray. I go across the street and provide my insurance card, all digital. I go back to the doctor's office and he's already got it on his computer
- All contact with my insurance (TK) is digital and handled via an app
- Probably only applies in big cities, but in Berlin I have 1 Gbps internet for 30 euros a month
TK works well (and I like it), but it's an exception in Germany. Most things are really analogue. Less than a year ago you still had to get a printed sick note.
The price of that internet also seems to be a kind of promotion, with 1 Gbps costing twice as much on average.
That’s inflammatory and unnecessary. And you can hardly say “check the data” without providing a reference. This kind of comment does not belong here or in any decent discussion.
I saw that, and a car driving around San Francisco with the same message on it, and had the same thought. Who funds that, and who is their real target audience? With the tech crowd you might need a bit more nuance.
I would basically never assume that the EU actually understands the technical parts of these regulations. On its face it sounds pro-consumer, but USB-C is hardly a unified experience, and I am almost sympathetic to Apple here from a user-experience perspective.
> The EU effort to standardize phone connectors led to nearly every phone manufacturer adopting micro-USB and then USB-C.
So, in other words, the EU forced one standard, which it is now abandoning (generating piles of micro-USB e-waste), and is now doing it again. And this is Apple's fault for not moving to USB-C in... 2012.
Didn't Apple use USB-A wall chargers with the 30-pin for phones at a time when everyone else was on proprietary power bricks?
> So, in other words, the EU forced one standard, which it is now abandoning (generating piles of micro-USB e-waste),
This is hyperbole. Especially because a common retort on these articles is “surely this standardisation kills innovation”.
A standard was adopted, preventing lots of e-waste. However, the old standard is now past its useful life, so a revised standard is being universally adopted.
This is the best of both worlds: a standard reduces the amount of duplicate adapters manufactured. However, it doesn't hold back progress and innovation when needs genuinely change.
If USB-C is mandated, there will never be a newer standard to replace it, because no company will invest any time or money into developing one: it would not be legal to ship until it's blessed by the government, and it won't be blessed until it has critical mass, which it can never have!
If the rest of the world mandated the Micro-USB standard at the same time as the EU previously, we would never even have USB-C! No one would have bothered to invent it.
USB-C doesn't have to stay mandated. Once every single device and charger is using USB-C, they can let the law expire and the market will keep it a de-facto standard because no one is going to buy a phone that doesn't have USB-C unless it has a lot of advantages. Notice how micro-USB is no longer mandated in the EU?
Also, USB-C isn't mandated to be the only port on the device. Add multiple charging ports. Plenty of laptops do that, even a few phones. It'll make the transition easier for people too, because they can keep using their USB-C chargers if they decide the new technology isn't a necessity. Just like how plenty of commenters here want to keep using Lightning cables.
> If USB-C is mandated, there will never be a newer standard to replace it,
The law has a reevaluation period of five years. Basically lawmakers and industry representatives will convene every five years to see if a new connector should be mandated. We already know that this works, because the entire industry excluding Apple had been doing it from the day the EU first told everyone to clean up their mess "or else".
I have a single cable that can handle up to 100 W of charging power and a single Anker charger with a USB-C port. I can charge my phone, headphones, laptops (MacBook included), and power banks, and I can transfer data and power between my devices just by connecting them (want to charge one phone with another? Easy peasy. Want to transfer data across devices? Easy peasy). And all this with a single USB-C cable. So tell me again how this is not a unified experience, compared to needing to carry 2-3 cables to charge a MacBook/iPad and iPhone/AirPods?
To be clear, I am pro USB-C, but I do get the arguments from the other side.
If I have a lightning cable and a device with a lightning port, I can plug it in and not really think much about it.
If I have a device with a USB-C port and a USB-C cable, I have no idea whether I can charge at full speed or even charge at all.
As an example, I plug my Xbox controller into my PC via USB-C. It keeps it fully charged. Recently I tried to plug in my USB-C headset into the same cable and it didn’t charge at all. In fact my headset died shortly after I unplugged it (after “charging” it all night), which was a bit of a pain. Turns out it can charge with my laptop charger or my iPad charger but not the Xbox cable plugged into my PC.
Maybe this isn’t a totally fair comparison, but it’s my personal experience.
Lightning "just works" only because it's all made and validated by Apple. If you want guaranteed compatibility, just keep buying Apple-branded Type C cables and adapters like you did for Lightning.
The only difference now is that you also have the option of using other cables and adapters if you want/need to, as well as use your Apple charging stuff for charging other things.
Yeah, I totally agree. The only annoyance I see is that you have to find out through trial and error which cables and power adapters work with particular devices. I mentioned in a different reply in this thread that there ought to be some kind of consumer-friendly labeling on these things, so that you can see at a glance what will work with what.
You don't have to. You can. As the previous comment mentioned, nothing stops you from buying all the accessories and cables from Apple at a premium like you used to with lightning.
Funnily enough, this is why the only USB-C chargers and cables I carry around are my Apple ones - because they've been the only ones that I've been able to reliably trust across all my other devices (including USB-C Apple devices).
In almost every instance, stuff like that is caused by the manufacturer not reading the specification and implementing it improperly. Usually the cause is trying to save literally $0.001 by leaving out a component they think is "optional".
The specification is very clear with regard to chargers: a charger with more watts is always better than one with fewer watts. For any pair of chargers, if charger A provides X watts and charger B provides Y watts, and Y > X, then every device that can be charged by A can also be charged by B. This means a laptop charger will always also charge your smartphone, although the opposite might not be true, since the laptop may have a minimum power requirement.
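The ordering rule above can be pictured with a toy model. To be clear, this is an illustration, not the real Power Delivery protocol (real PD negotiates discrete voltage/current profiles, not a single wattage number); the device wattages are made-up examples.

```python
# Toy model of the "more watts always covers fewer watts" rule for USB-C chargers.
# Not the actual PD negotiation; just the compatibility ordering the spec implies.

def can_charge(charger_watts: float, device_min_watts: float) -> bool:
    """A device charges if the charger meets its minimum power requirement."""
    return charger_watts >= device_min_watts

phone_min = 2.5    # a phone will trickle-charge from almost anything
laptop_min = 30    # a laptop often refuses chargers below some minimum

small_charger = 20   # watts
big_charger = 100    # watts

# The big charger handles everything the small one does...
assert can_charge(big_charger, phone_min)
assert can_charge(big_charger, laptop_min)
assert can_charge(small_charger, phone_min)
# ...but the reverse fails: the phone charger can't meet the laptop's minimum.
assert not can_charge(small_charger, laptop_min)
```

That asymmetry is exactly the "laptop charger charges your phone, but not vice versa" case.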
That's really weird, but I remember in the past there were some older usb-a ports that did not send enough power to charge some stuff, perhaps your headset requires a higher current than the controller or something.
Here I can charge my laptop (although slowly) via the usb-c on my PC. All other devices work fine too.
What kind of piece-of-shit bargain-bin cables do you guys buy? Is it another $0.99 Amazon/AliExpress cable that you expect to handle 130 W charging while transmitting 10 Gbps, or similar?
I have circa 10 different USB-C devices, and I use all their provided cables to charge all the other accessories. Everything works as expected, charges fast, transfers fast. The key is to not cheap out and buy the cheapest crap. Just like with anything else in life.
For low-speed (USB 2.0) devices, there are exactly three kinds of cable: one supports up to 60 W, one supports up to 100 W, and one supports up to 240 W.
Considering all the devices you listed need less than 60W, they should charge with all cables. If they do not, the manufacturer gave you a broken cable which does not follow the USB-C specification. Blame the manufacturer, not USB-C.
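The argument above can be sketched as a quick check. The three wattage tiers are my reading of the USB-C cable classes (plain 3 A, e-marked 5 A, and Extended Power Range); treat the exact labels as illustrative.

```python
# Toy model of USB 2.0 (low-speed) Type-C cable classes and their power limits.
# Tier names and the mapping are simplified; the point is that any device
# drawing under 60 W fits within every class of compliant cable.

CABLE_MAX_WATTS = {
    "plain (3 A)": 60,     # no e-marker chip, limited to 3 A
    "e-marked (5 A)": 100, # requires an e-marker chip in the cable
    "EPR": 240,            # Extended Power Range, up to 48 V at 5 A
}

def charges_with(cable: str, device_watts: float) -> bool:
    """A compliant cable charges any device whose draw fits its limit."""
    return device_watts <= CABLE_MAX_WATTS[cable]

# A phone, headphones, or controller drawing ~25 W works on every tier:
assert all(charges_with(c, 25) for c in CABLE_MAX_WATTS)
```

So if a sub-60 W device fails on some cable, the cable is out of spec, not the standard.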
The fact there are different cables is what makes USB-C stupid.
Do you think a mum and dad at home know the difference?
My dad had a USB-C cable for his Android tablet. When I got him an iPad he tried using the same cable. It wouldn’t charge. That makes the whole point of USB-C pointless.
Defending the stupidness of USB-C cables just makes you a fanboy.
> The fact there are different cables is what makes USB-C stupid.
If you pretend you only need 60 watts, there is only one kind of cable.
> My dad had a USB-C cable for his Android tablet. When I got him an iPad he tried using the same cable. It wouldn’t charge. That makes the whole point of USB-C pointless
That wasn't because of multiple kinds of cable. That was defective equipment. They're not the same problem.
It's 100% the spec's fault. It tries to solve all of life’s problems but only creates more. There should be one spec for a cable. Period. It should be universal.
You mean like how there are different cables for HDMI, DisplayPort, or even a regular power cord?
The biggest problem is that manufacturers don't follow the spec. You are supposed to physically label the cable with its capabilities, but literally nobody bothers to do so.
Cos once you plug in your TV, you’re totally gonna be like: oh, now let’s use that HDMI/DisplayPort cable on this other TV in the other room. Now let’s use it on the monitor. Back to the other TV.
With the same Xbox cable but with a proper charger, does it work?
Most of the time, you avoid problems of "PD device hierarchy" by using a proper charger (i.e. a device that only has the functionality to charge other devices).
> Most of the time, you avoid problems of "PD device hierarchy" by using a proper charger (i.e. a device that only has the functionality to charge other devices).
The problem is with the "most", which means you can't count on it.
I have usb-c headphones that can be charged with my hp usb-c laptop charger (which is a "proper charger", I guess, since it only does charging) or from a regular computer usb port. They don't pretend to have any high-powered charge mode (manual says 3 hour charge time and "usb charging").
My usb-c ecig won't charge from that. It will only charge from either my pc's usb-a port or random "low power" usb phone charger.
I haven't tested it with any of the fancier high-powered adapters, since I don't own any. But, clearly, usb-c charging ports are not that universal.
Sure, as a technically inclined user, I can understand that.
It actually is a Chinese brand, but not some one-off brand bought off Ali Express. It's a device bought from a reputable store in France with a name that I've seen around for several years (Vaporesso). Sure, that fact, in and of itself, is not enough to guarantee that corners haven't been cut and that the product is actually up to spec.
But, as a random consumer, how am I to know that it's "a pretend USB-C"? It looks exactly the same as my headphones, comes with the exact same-looking cable. There are no markings on it. The usb-c ports on my laptop have a bolt icon next to them, but its usb-c charger doesn't have any marking. They also don't say it shouldn't be used with anything other than the laptop it came with, and actually is able to charge my headphones and my mom's usb-c phone.
Even though I've kinda followed the talks about PD, negotiation, etc., it's still not clear to me why this particular combination doesn't work. I was under the impression that, lacking any negotiation, ports should default to the basic 5 V / 500 mA. So my fat laptop charger should be able to at least trickle-charge the e-cig.
Because USB-C does not have a defined "upstream" and "downstream" side of the cable, a user could connect two chargers together. This is obviously a bad idea.
To prevent this, a USB-C charger is only allowed to provide power on the cable once it senses a downstream device on the other side. A legacy USB-A to USB-C cable always applies power, though, as this poses no danger.
Some low-quality brands think they are smart and leave out the two $0.001 resistors needed for the device to advertise itself. This means it will only work with a USB-A to USB-C cable, and not a real USB-C charger.
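The detection logic described above can be sketched roughly as follows. This is a hypothetical simplification: a real Type-C source senses an Rd pull-down resistor (about 5.1 kOhm) on a CC pin before enabling VBUS, and the actual state machine is more involved than this.

```python
# Sketch of why a device missing its CC resistors charges from a USB-A-to-C
# cable but appears dead on a real USB-C charger. Simplified model only.

RD_OHMS = 5100  # pull-down a compliant device presents on CC to say "I'm a sink"

def usb_c_charger_enables_vbus(device_cc_resistance) -> bool:
    """C-to-C charger: only apply power after detecting Rd on the CC line."""
    return device_cc_resistance == RD_OHMS

def usb_a_to_c_cable_has_power(device_cc_resistance) -> bool:
    """Legacy A-to-C cable: 5 V is always present, no detection involved."""
    return True

compliant_device = RD_OHMS  # has the two cheap resistors
broken_device = None        # manufacturer left them out to save $0.002

assert usb_c_charger_enables_vbus(compliant_device)   # charges normally
assert not usb_c_charger_enables_vbus(broken_device)  # "dead" on a C charger
assert usb_a_to_c_cable_has_power(broken_device)      # but works on A-to-C
```

Which matches the symptom in the e-cig story: it charges from USB-A ports and old phone chargers, but not from a proper USB-C charger.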
Using the Xbox controller cable with an actual charger does work. The confusion came with how the PC delivers power to the controller but not the headset.
It’s ultimately not a big deal, and it’s surely something I can learn to intuit. But I think lots of consumers (I think about my poor mother) would benefit from a clearer labeling scheme on the various devices, ports, chargers, or cables.
The general problem with USB-C is that the connector supports multiple different protocols, and cables are allowed to not be of sufficient quality for all use cases, including high-power charging (the worst example is probably high-bitrate uses like cables for computer monitors). Similarly, the devices on either end needn’t support all the potential protocols/uses. Furthermore, many USB-C devices that are sold are low quality and can damage devices that are plugged into them by failing to follow the specification (eg delivering too much power). So, if you’re a consumer and you have two devices with USB-C ports, and a cable with USB-C on both ends, connecting the two together might work fine, work with poor performance (without telling you this), not work, or damage your device. And there isn’t really a good way to tell in advance which will happen.
Lightning isn’t that much better though. You can get lightning to USB-C cables so you can plug into various bad devices and have other USB-C problems, though you likely won’t try to plug into a monitor for example.
There’s maybe an argument to be made about the ports too. I think it would be harder to clean dust/fluff out of a USB-C port and maybe they could be more fragile too because of the spike in the middle of the port.
It still seems to me like Apple probably also like that they have a lot more control over lightning. And they already have USB-C on iPads and computers so they clearly don’t think it is totally terrible. But there are a lot more iPhones in the world than other Apple devices and I can imagine eg Apple having to spend a load of money on customer support, etc, due to the USB-C issues, or getting blamed when things go wrong because of the cable or device on the other end.
I think it would be better if the possibility of connecting two devices with a cable implied that they would work together, but I’m not sure how that could be done without either more ports or more expensive cables/charging bricks. And I don’t know why a USB-like organisation would have more success at ensuring things follow the standards than USB currently have.
"Furthermore, many USB-C devices that are sold are low quality and can damage devices that are plugged into them by failing to follow the specification (eg delivering too much power)."
The simple solution, as done elsewhere across many technical endeavors, is to mandate standards; that's why the ISO exists.
For example, if I plug a 220/240 Volt appliance into a power wall socket that's rated at either 220 or 240V then it should work properly—one doesn't expect say 400V out of said socket.
That countries mandate a given voltage ± a specified tolerance that's well defined is specifically to avoid malfunctions/equipment damage.
A manufacturer that goes against mandated standards should suffer the consequences.
If other industries have no problem with mandated standards then why should the IT/computer industry be excepted or any different?
USB is standardised and yet here we are. It seems to me that somehow those standards aren’t sufficiently well mandated and I don’t really know how to improve the situation. Maybe there is some incentive for the people selling the low quality devices to move to higher quality devices over time (e.g. you get kicked off Amazon less) but that feels pretty weak to me (if better devices are more expensive, you won’t get any sales). Possibly figuring out how to punish Amazon for their third-party sellers selling faulty or non-compliant devices would help but it seems like something that would be pretty difficult to me.
Many standards owe their origins to some development that few used initially. Again using power as an example, that's how we arrived at the common 110/220V voltage standards we see in use around the world.
Due to common or widespread usage, a facility outgrows its origins and/or its patents expire. Thus its original developer is no longer in charge, or can no longer fully manage its ongoing development, which is the current situation with USB (remember, it's happened thousands of times before). Governments, through their standards bodies, then step in to protect consumers, etc., by ensuring standards are maintained (weights & measures are a classic case).
We are now seeing the first instances of this intervention process with USB. Unfortunately, the association responsible for USB has not kept up with the times (this usually happens because members cannot agree and the lowest common denominator becomes the released standard).
To make matters worse, USB got off to a bad start; it was a dog of a 'standard' from the outset. For starters, it was ridiculously slow: 12 Mbps if I recall, about a fifth of the speed it ought to have been given the then state of hardware performance. I recall being at the trade show when it was released and laughing at how slow this 'toy' was; we all joked about it. The trouble is that this ill-conceived standard started from such a low base that it's never caught up; every release has always been too slow. Moreover, the mechanical aspect of the standard has always been inadequate: USB plugs and sockets are mechanically flimsy, slipshod, and of inadequate design.
Unfortunately, as USB has become the widespread standard and seeing that industry has not managed its development well, it's little wonder government has stepped in. What we're now seeing here is just another instance of a well worn pattern.
Edit: just to be clear, if the standard had been managed well, then Apple would not be able to offer Apple-specific enhancements, as they would not comply with the specification. If the standard falls under a government authority (often via an ISO standard), then adhering to it is mandatory. The advantage of this is that it puts all manufacturers on a level playing field. If that were the situation here, then it would be unlawful for Apple to offer nonstandard USB enhancements.
Ehh, I personally thought USB-C was a huge mistake and was glad Apple brought back MagSafe on the new MBPs. The cable is higher quality than USB-C, the charge rate is faster, it’s less likely to knock the computer off a table if someone trips on it, and so on. The whole “we must force everything to be usb-c” obsession is so weird and it’s not always the best standard for every use case.
In theory, yes. The specification mandates that all USB-C cables must have 2.0 wires, but in practice some cheap brands have invented "charge-only" cables which lack them.