Hacker News
iPad Pro (apple.com)
480 points by tambourine_man on Oct 30, 2018 | 783 comments


To me, the shift to USB-C signifies a fundamental change in how Apple is approaching the iPad and computing going forward. Those stupid "what's a computer" ads make a little more sense. With USB-C, developers will support whatever I/O standards they need to in iOS, and the ecosystem of monitors, keyboards, and peripherals we all bought for our laptops/desktops will now begin to work with iOS too. This feature alone grows iOS into a much more powerful OS.

By this logic, it's pretty clear Apple is killing macOS's x86 dependency (and macOS itself) in two ways. 1. Shifting compute-intensive workloads to A-series chips. This started with the small stuff around "security" and Apple Pay, but today the video encode/decode functionality in the T2 is a huge leap. This will extend into graphics next year (thank god, cause fuck Intel's integrated graphics) with APIs like Metal, and going forward almost anything I/O, display, and every function outside of the CPU. 2. The iPad: offering a compelling device that has all the features and capability of laptops without two decades of baggage.

We've all been so worried about the convergence of iOS and macOS, but after today I don't see a macOS future. Sure, it will move to Apple's ARM chips, and it will last another decade, but the shovel is out of the shed. It's called the T2.

(Jailbreakers, we need you)


The biggest pain for me of lack of access to the underlying OS has turned out not to be any of my thousands of tiny gripes with the way the software works that I can't change (e.g. setting DNS on the cell connection), but the lack of transparency and ability to debug basic processes. What is the computer doing now? Why is it hanging? Why did that app crash? Why is my network failing? Who is that app talking to? My main fear is not, in the end, that I won't control or own my device (though I do feel this), but that it will turn computing from a transparent network of understandable processes—which I find beautiful, and which drew me deep into programming—into a Kafkaesque nightmare of inexplicably bad software and automated computer support, not just from Apple, but from all the apps in the market they created.

I am absolutely sure that they will port Xcode to the iPad itself and allow visibility into my own apps. This is not the same as a workstation.

EDIT: in retrospect, this is only tangentially related to parent post—my apologies. Also: wording, punctuation.


The reason Apple cares so much more about iOS than MacOS is vastly increased control. With iOS, they control the hardware from the chips up, the OS, which apps you can run, and via Safari-only access to the internet, even what you can and can't do using external services.

It makes computing & communications an Apple theme park, where you can't bring anything forbidden into the park, have to buy everything you need from Apple concessionaires, and can't do anything that they haven't planned for you to do. The people in the "park" pay to enter and then become a new type of domesticated herd to be milked.

You can't allow people to bring heavy equipment and power tools into a theme park, but you also can't build the park's controlled features without them, so how do you give power to authorized builders while keeping it away from users? In a physical theme park, they have controlled users during the day and empowered builders at night.

But in Appleworld, I'm thinking they'll keep iOS locked down by limiting the building tools to MacOS.

A unix-style workstation that gives maximum power to its users is so antithetical to everything Apple stands for that macOS would never be created today. But since it already exists, they can take advantage of it to make iOS even more locked down than it would have been. (And they refuse to let any other company use macOS, and won't port Xcode anywhere else.) You want to see what your computer is doing (file sys, CPU procs, memory, network, etc.)? Do it with a Mac. iOS is not YOUR computer. It's Apple's. You only paid for admission to iOS, and you aren't allowed behind the locked doors. If you want control, you're in the wrong place.

I'm afraid that if they ever figure out how to sandbox an iOS partition of some sort to allow builder tools to run on iOS itself in a totally controlled way, then the Mac is toast, but the Mac gives them time. I think that for the foreseeable future, they won't risk any accidental empowerment of iOS users and will limit the power tools to the Mac.


The reason Apple cares so much more about iOS than MacOS is easily obtainable through earnings reports.


A decade ago, I never would have thought that Apple would be the ones mainstreaming Trusted Computing. (Well, the iPhone was out... maybe 12 years ago.) These were machines that supported the hacker ethos of tinkering and experimentation, even if the company involved was dedicated to abstracting more and more of the computer's guts away from the user.


You could crack the case and do stuff to an Apple ][. From day one the Mac has been hostile to this, right down to weird recessed screws.


There are lots of tools you can use to develop iOS apps other than Xcode. Phonegap/Cordova/OutSystems, Visual Studio, Xojo, Mendix, LiveCode et al.

End of the day, Apple, like any other corporation, does what it does because it makes money for them. The alternative is to use a "free" operating system like Android, and you subject yourself to constant surveillance. It's like that scene in The Big Short: "tell me how you're f*ing me." At least with Apple, you know straight up what you're getting yourself into. Your relationship with Apple ends when you stop using their devices. Not so with Google/Facebook, where your data lives on in perpetuity, used for purposes beyond your control.

Now I'm not saying it's not possible to have your cake and eat it too. I'm saying, where is this mythical product, where a user is free to do whatever he wants with his device? What's the market size? How come no one has built it yet?


It's called a "hand-built Linux desktop."


Typing this on a netbook, the only surviving Linux desktop I still use.

For some reason most Linux conferences end up being about file systems, containers, device drivers and what not, seldom about desktop development.


Because conferences require money to run, and the kinds of people who back conferences tend to only run Linux on their servers, not their workstations. Enterprise Linux in general is happy enough being in the data center; it could have made harder pushes for the desktop for years now, but hasn't.


Basically, what Apple and Microsoft have found out is that it suffices to offer a POSIX CLI for UNIX "desktop" users.


(Apple user disclaimer) Apple isn’t really offering just a CLI, it exposes the UNIX on which it is built. My understanding of Windows is that it just offers a runtime and abstraction layer.


Windows offers a complete personality for Linux syscalls.

Objective-C/Swift Frameworks have nothing to do with UNIX, Apple could easily port them to another kernel architecture.

Likewise, the OS X driver model has nothing to do with UNIX, being modeled on a C++ subset; it was originally written in Objective-C back in the NeXTSTEP days.

The only UNIX GUI certified as such is Motif, which Apple certainly isn't offering on their products.

Nor is the audio stack in any way related to UNIX.


I never claimed macOS was UNIX, but Darwin is certified UNIX[1]. Opening Terminal in macOS gives you a real bash without any of the rest of macOS.

1. https://www.opengroup.org/openbrand/register/brand3555.htm


And access to Bell Labs teletype-style applications like a command shell, because everything else built on top isn't UNIX related, just like NeXTSTEP.


Yes, and the market size for that is…?

We don't all grow our own crops, make our own clothes, brew our own beer, refine our own crude oil etc, even though we may have the knowledge to do so. Some things are better left to specialists, due to efficiencies/legal reasons. Surely we can agree on this?


> How come no one has built it yet?

People are greedy and have no values.


1. It's not Safari-only. It's WebKit-only, and there are legitimate security reasons for this.

2. The idea that UNIX is antithetical to what Apple stands for is just ridiculous. It is still at the core of iOS and MacOS and is one of the areas where Apple continues to innovate.

3. Apple licensed MacOS in the past. It nearly killed the company, since third parties like Power Computing went straight after their core base and did nothing to grow the ecosystem.

4. You can see what your iOS device is doing. Plenty of apps allow you to see what processes are running, file system behaviour etc. Apple just doesn't build it in.

5. Anyone who thinks the Mac is dead needs to go have a lie down with some chamomile tea. Apple even today doubled down on the Mac with the new Air + Mini. And they will continue to grow and invest in the platform since content creation will always be largely done on a Mac.


There are not legitimate security reasons for WebKit-only. That makes zero sense. Chrome has hands down beaten every other browser on security for years. Go read the CVEs. There is absolutely zero reason to believe that iOS WebKit has a better track record than WebKit at large. Therefore it's nearly certain your iOS device is less secure browsing the net with WebKit/Safari than if Apple allowed real Chrome to run there.


You misunderstand. Apple can fix WebKit bugs, but they can't force Google to fix Chrome bugs.

Also for Apple to allow v8, they would have to permit 3rd party unsigned executable code (JITed in this case). Apple doesn't want to allow that.

This is not about CVEs, it's a meta discussion of not outsourcing the security of the platform.


Which are both control issues, not security issues.

There's no actual security reason for iOS to be locked to Apple's webkit exclusively and there's no actual security reason for iOS to not allow JIT'd code. Those are both control issues, not security ones. JIT in particular is purely a control issue - the process itself is already sandboxed which is the one and only actual security boundary here. Preventing a JIT doesn't prevent arbitrary code execution, after all, especially if there's an interpreter in play which Apple sort of allows.


Google has a track record of fixing Chrome bugs far far faster than Apple fixes Safari bugs so the point still stands. You'd be safer and more secure if allowed to run real Chrome on iOS than Safari by every measurable metric. In other words it's effectively provable the restriction is not about security.

> Apple doesn't want to allow that.

Yes, that's all it's about. Apple's control. Other excuses people make up for their reasons are demonstrably false.


> Also for Apple to allow v8, they would have to permit 3rd party unsigned executable code (JITed in this case). Apple doesn't want to allow that.

And how does that pertain to security? Android doesn't seem to have a problem with JITed code in sandboxed store apps. Neither does Win10.


Actually UWP does not allow for JIT code, other than Chakra, as you should know.

Hence MDIL in WP 8.x and .NET Native on WP 10 onwards.


WinRT sandbox has always allowed for JIT'ted code, all the way back to the original Windows 8 release - all .NET Store apps back then were running on a JIT. .NET Native is a later addition that is there solely to improve performance, and it is still opt-in.

Now, Win8.x did not allow for third-party JIT compilers in the sandbox; it was only CLR or Chakra. But UWP does - look for the "codeGeneration" capability here:

https://docs.microsoft.com/en-us/windows/uwp/packaging/app-c...
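
For reference, that capability is declared in the app's package manifest. A sketch of the relevant fragment (surrounding elements elided; namespaces per Microsoft's UWP manifest schema):

```xml
<!-- Package.appxmanifest (fragment): opting a UWP app into runtime code generation -->
<Package
  xmlns="http://schemas.microsoft.com/appx/manifest/foundation/windows10"
  xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities"
  IgnorableNamespaces="rescap">
  <!-- ...Identity, Properties, Applications elided... -->
  <Capabilities>
    <!-- Restricted capability: lets the process create executable pages, e.g. for a third-party JIT -->
    <rescap:Capability Name="codeGeneration" />
  </Capabilities>
</Package>
```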


WP 8.x did not JIT code on device, hence the whole MDIL and cloud compiler on the store.

WP 8.x only did dynamic linking at installation time and when OS updates were done, by replacing symbolic labels with the actual target destinations. Everything else was already compiled at the store and downloaded as binary into the devices. This was the whole point of MDIL.

There is a BUILD session and a further Channel 9 deep dive interview showing how MDIL deployment works on WP 8.x.

So Chakra was the only JIT in town.


Since the comment no longer allows editing, here are the sources:

"BUILD 2012, Deep Dive into the Kernel of .NET on Windows Phone 8"

https://channel9.msdn.com/Events/Build/2012/3-005

"Mani Ramaswamy and Peter Sollich: Inside Compiler in the Cloud and MDIL"

https://channel9.msdn.com/Shows/Going+Deep/Mani-Ramaswamy-an...


Oh, I meant Windows, not Windows Phone (for 8.x, that was a big difference still).

Either way, code generation is there today.


> You misunderstand. Apple can fix WebKit bugs, but they can't force Google to fix Chrome bugs.

If Apple could make their browser the de facto browser of tomorrow in the way Chrome is today, then Google would surely work harder to fix the pain points Apple identifies.


>4. You can see what your iOS device is doing. Plenty of apps allow you to see what processes are running, file system behaviour etc. Apple just doesn't build it in.

Do you have any examples? I've never seen anything like this in the app store.

Some apps give you limited access to their filesystem but (as a Jailbreaker) I'm pretty sure the larger system isn't viewable under normal circumstances.


Search for Lirum. Tons of apps like this.


Apple was always focused on their own platform; even Steve, while at NeXT, saw UNIX just as a way to lead developers into their hardware. The real juice was in the NeXTSTEP Objective-C world, not POSIX.

What is happening now is that those who only discovered the Mac world after OS X, and don't care about Objective-C/Swift development, are feeling that their pretty-UI Linux replacement is no more.

Apple naturally cares about their Objective-C/Swift developers first.


I do find it odd to say content creation will be done on a desktop OS rather than a mobile OS, when it seems that more and more people are using their mobile OS as their primary means of creation and sharing. Maybe in a less powerful way, but mobile OS apps and hardware have really democratized content creation, and I don't see that trend reversing.


> "mobile OS apps and hardware have really democratized content creation"?

Democratized? Is that because you perceive mobile devices as more affordable? Have you checked the price of a new iPad Pro?


The majority of "new content" is being created with mobile OS apps today. Family photos, social media, etc. The fact that it's mostly all low quality junk is orthogonal to that reality.


Mobile systems have democratized content creation not because of price (though that too; not everyone buys the most expensive device) but because they are demonstrably easier to use for content creation than anything that came before them.

Source: personal experience with multiple over-65 people for whom an iPad or iPhone is the first internet-connected computer they've ever owned.


"Content creation" isn't usually referring to "family photos and social media". Besides, family photos are made with digital cameras, not mobile apps.

In the context of this thread, the target user of iPad Pro is more than someone taking pictures of their kids and pets on weekends.

As for my opinion on this, I believe in using the right tool for the job, AND using multiple tools to get the job done. Sometimes an iPad Pro might be fine for certain stages of a project, for sketching out backgrounds or initial ideas for a design, and then importing that into another system such as animation pipelines or other applications on desktop or laptop or workstations. Depends what the job is.

I've never met a professional designer who works only on their tablet. Professionals love their workstations, multiple screens, and all the comforts and power of a proper setup. If you're just marking up a PDF, I wouldn't count that as professional work!


> "Content creation" isn't usually referring to "family photos and social media".

I wish I agreed with you, but I can't but see that as elitist condescension against unsophisticated content.

> the target user of iPad Pro is more than someone taking pictures of their kids and pets on weekends.

I wish I agreed with you, but the iPad Pro is marketed as a tool for "serious professionals" while in reality that's just an aspirational message, and a large number of iPad Pro buyers will use them to take pictures of their kids and pets on weekends.

The same is true of DSLRs. They're supposed to be professional work tools, but the overwhelming sales numbers are to amateurs taking photos for trivial reasons. (That said, the DSLR market has now matured to the point where they do target the amateur audience.)


>What is the computer doing now? Why is it hanging? Why did that app crash?

For what it's worth... On a computer, I'd look at log files. On an iPad, I'd plug it into my computer... then look at log files.


You can also go to Settings>Privacy>Analytics>Analytics Data and see some pretty specific logs on why things crashed or why they were slow. You can see if an app used too much CPU time, OOM kills (a.k.a. JetsamEvents, in iOS-lingo), and explanations for app crashes (I can see that iOS killed an app for trying to write to the photos library without permission, for instance).


And then you realize the closed app you are using has none.


You have to keep in mind that Apple is going to dogfood any potential "iPad as workstation" product with its own developers. If their own developers jump out in rage, then they'll fix it before release.


> If their own developers jump out in rage then they'll fix it before release.

That strikes me as very optimistic: the business needs come first.


Still, it's funny to see Google beat Apple to the punch here and make ChromeOS a valid dev machine for most people before Apple. USB-C doesn't let me run Docker, VSCode or IntelliJ. I'd love to bring an iPad pro for the weekend when I'm on call, but it's still not anywhere near.


"for most people" -> "for most javascript devs" ?

totally ridiculous dev machine for scientific or financial computing, no?


Anything you can run on Linux, you can run on a Chromebook, without even having to do anything hacky these days, as I understand it.

So, nothing to do with JavaScript specifically.


I think it was a reference to the capability of the hardware.


I do a lot of scientific computing and data analysis development work and most of the time during my dev cycle I'm working on tiny test data sets developing and testing new approaches to certain problems. Most of that work could absolutely be done on something with the capability of the new iPad pro. In fact I do a fair bit of development on a mid-range surface pro 4 which has pretty similar specs as the new iPad pro.


there's plenty of non-JS dev work that isn't scientific or financial. Since ChromeOS now runs Linux apps, any dev is feasible; arguments about machine specs are about the hardware, not the OS.


Last time I checked only Google devices do.


Nope; plenty of other manufacturers' Chromebooks now run Linux. Google's are certainly among the higher-spec'ed though.


I would take an iPad Pro with the overpriced keyboard and an Apple mouse. I think if they can jailbreak iOS 12 to support a Bluetooth mouse... we might be onto something.


You can always use something like Amazon WorkSpaces or similar RDP services.

That's what I do when I need to do development on the go.


None of what I use a Mac for is available or practical on iOS. Atom, Git, bash, ssh, llvm, Python, R, PostgreSQL, CouchDB, Docker, nginx, Node.js, Homebrew, curl, wget, ffmpeg...

I’m typing this on an iPad now. I mainly use it for web surfing, Netflix, email and casual gaming.


All of that can be done in a remote VM. I use Coda + Shelly on an iPad pro and it's a pretty decent development experience. The real barrier is for (non-web) GUI development.


What's the point of having an A12X chip while running everything remotely?


The UI is absolutely smooth unlike any laptop!

Coda has a local server/browser pair you can use for simpler things. But usually the kind of web work I'm doing barely exercises a single core, it all runs on a $2.50/month VM.


Remote VM is an interesting idea, what about mouse support?


You can use a Swiftpoint mouse with the Jump Desktop app (that is my setup); it's highly portable. Don't get the Apple iPad keyboard though, no Escape key...


out of that entire list, the only thing that called for a mouse was atom

maybe we're returning to the age of terminal-only development ;)

but with vector font rendering!


Actually there were terminals with vector fonts in the '60s https://www.cca.org/vector/


When you're starting from a change in plugs and end up predicting shifts in CPU architecture ten years down the road, there's a pretty high chance it's just fitting noise into your pre-existing narrative.

FWIW: the death of the Mac has been predicted constantly since at least the original iPhone came out. If it takes another decade to happen, at least we can lay these accusations to rest that publicly traded companies are forced into short-termism.


I hope iOS 13 will have mouse support for the iPad Pro.

Only being able to touch the screen while using the iPad Pro in laptop mode feels really limiting.


It's not only the fact that a finger is so much less precise than a cursor, but also that UIs made for fat fingers take up a lot of space. Buttons are bigger, when dragging you need to know what's going on under your finger, etc.

I also think that the lack of a desktop UI and a mouse is a huge limiting factor of the iPad Pro.


Also, sometimes you are using a UI not designed for touch (e.g., remote desktop).


At least AWS has solved that problem. The iPad Workspace client supports a Bluetooth mouse. I wish it were supported system wide.

https://docs.aws.amazon.com/workspaces/latest/userguide/amaz...


You can do mouselike pointer selection via the on-screen keyboard: force-press on iOS devices that support it, or rest two fingers together on it, and your keyboard becomes a mouse pad.

I"m not much of a mouse user so this is adequate for me.


I don't know why this factual comment would be voted down. It is functionality built into the OS to produce mouse cursor motion.

Not like my life depends on a karma point; this is simply puzzling.


Because it's a horrifically limited "mouselike cursor": it's entirely confined to text entry fields.

It achieves nothing of what people actually want mouse cursors for.


Okay, then say that. Don't downvote, or if you have to downvote, justify it then and there.

BTW, people use the mouse to select and change the cursor position all the time, so it's very relevant to the topic since it's a step toward desktop-like mouse support.


I always forget about it, but that is a great feature, especially since it's nearly impossible for me to modify the start of a URL in Safari since my finger bumps up against the edge of the case.


All that horsepower, no accurate steering wheel.

Is it a pro device if the choice to add a mouse isn't there? Many pros work in pixel accurate selection instead of pencil or finger.


"Pixel-accurate" might be an obsolete term today when you have 9 or more pixels where previously there was 1. (I know, that isn't what you meant ;-)

See how AutoCAD used a command line for ages to implement precise control over coordinates. You don't need a mouse for that. At the same time, the imprecise things that you use a mouse for (e.g. dropping a symbol from a library into a workspace, or connecting controls with outlets in an interface builder) are much easier to do using direct touch.


Obsolete or not, it was clear what I meant. :)

A mouse has a lot of reasonable functions that are quicker for selection and editing, in ways touch doesn't come near. Touch has its own benefits.


Is pixel level selection an issue when the UI immediately supports an arbitrary zoom level? (Pixel grows to the size of your palm).


The iPad UI is designed around touch, so generally not in those cases.

Add a few select and very valuable uses (remote shell, remote access, desktop/graphic layout) and a mouse is much faster.


It's a "Pro" device because Apple used a flimsy keyboard that doesn't allow the possibility of actual lap-top use.


You can buy other keyboards... it's the specs that make it Pro.


Can you get a USB-C keyboard? BT keyboards are generally painful to use


I don't see why a USB-C to USB-A adapter wouldn't work for a keyboard.

But yes, it does look like it's a product you can actually buy

https://www.amazon.co.uk/Macally-UCKEYE-Keyboard-12-inch-Com...


Yup, Logitech makes a good keyboard.


I wonder when they're just going to start calling it "iOS".


Apple: we won't give you touchscreen laptops because touchscreens are garbage.

Also Apple: here's a laptop with only a touchscreen.

Fanboys are going to have a field day talking this one up. (Edit: it's already started, with workarounds for text input only. Let's see how far it goes :)


Touchscreens are garbage on a desktop-oriented OS like macOS. The giant-ass buttons and controls in Windows 10 are one of the things that make it so infuriating for me. I don't want big buttons sucking up screen real-estate everywhere when I have a mouse available for precision input.


No, they said they're unviable on laptops because of having to lift your hands up. Now they're pushing the iPad Pro with a keyboard that props it up to stand vertically, without a touchpad, requiring the user to lift their hands up more often than, say, a Surface.


UX design in the past 10 years (since the iPhone launched) has incidentally been precisely in the direction of giant buttons everywhere.


I don't get what macOS vs iOS has to do with processor architectures or the T2.

Why don't you see a macOS future? It can use the T2 just as well as iOS, no?


> developers will support whatever I/O standards they need to in iOS, and the ecosystem of monitors, keyboards, and peripherals...

What are you talking about? Apple won't allow developers to do any such thing. They only recently allowed a narrow range of NFC uses, even though the hardware itself has been there forever. You think they're going to let you develop your own kernel drivers? Maybe with a $10,000 "hardware developer" account.


Jailbreaking will never be mainstream when people worry about the warranty on their $1k-2k+ devices. If you want people to own their own devices, you will have to write it into law, presumably in a more consumer-friendly jurisdiction than the US, which cares about business first, last, and only.


I'm aware the jailbreak community often felt Apple was targeting them specifically, and perhaps they were, but every jailbreak was also exposing a security flaw. Personally, I've never been worried about jailbreaking voiding my warranty -- I just never got sufficient value from it to make it worth the drawbacks. (Both value and drawbacks are subjective measures, obviously.)

While I understand the "you must own your own device" line, that's more of a rallying cry than a useful description. (I have a similar pedantic complaint with "if you're not the customer, you're the product," for what it's worth.) It's interesting to think about a regulatory framework that mandates companies build software-driven devices with the ability for end users to do their own software loads, but figuring out how to do that in a way that doesn't let companies wiggle out of it and doesn't essentially mandate insecure back doors in all "smart" consumer electronic devices seems to me to be a non-trivial problem.


Back in the day (iOS 3-4 ish?) you used to be able to install Cydia packages that would patch whatever exploit the Jailbreak originally used.

This isn't the case anymore, unfortunately, because the Jailbreak community is smaller.


I remember some of that. I didn't really get into the jailbreak scene too deeply, though. Usually, what happened is I heard that some Thing I wanted to do that you couldn't on iOS was available if it was jailbroken -- "tethering," for instance, back before that was added to iOS. I'd jailbreak the phone (which could be really easy or involve a lot of hoops to jump through, depending), and install Cydia, and install the App That Did The Thing. By that point I was usually already frustrated because the whole user experience of the software from the jailbreak side of things tended to be noticeably worse. (I'm sure there were exceptions, but in general, that was consistently my experience.) Then I would discover that the App That Did The Thing was a little buggy and not very well designed, and sometimes didn't even consistently Do The Thing, and I would look around the Cydia store and discover that there was very little other stuff that I actually wanted, and then something would crash in a way that took the whole phone down in a way that official iOS apps very, very rarely do, and I would say "enough of this" and put the phone back in jail.

tl;dr: I tried to be adventurous, but was too impatient. :)


Letting the user control the device inevitably means letting the user hurt themselves; that is inevitable and impossible to fix.

The least intrusive method is to allow users to add additional software sources and either opt out of sandboxing entirely, or do so only for select vetted apps.

This isn't hard; it would just hurt Apple's ability to take a 30% cut, as third-party stores would spring up.

User-owned personal computing devices have been a thing for decades; can we not pretend this is uncharted territory?


The comment of yours that I replied to explicitly mentioned jailbreaking, which is not the same thing as "additional software sources." I'd like to see Apple allow users to install signed apps from places other than the App Store, and I suspect that protecting revenue is the main reason they don't. But I'm not particularly interested in jailbreaking, even though I'd like to see functionality that iOS doesn't currently allow.


Well technically you can do that. Download code, build in Xcode, and install to your device. Probably not what you meant, but it does enable some side loading.


Further, there’s no theorem that jailbreaking will always be possible. I get the impression it’s becoming harder over time.


Jailbreaking has never violated the warranty. Apple just doesn’t support it so they won’t service your device until you update it to the latest software, which is completely reasonable.


It's completely reasonable that upgrading your phone and maintaining administrator access shouldn't be a challenge.


I.e., it voids half of your warranty, because anything but the most obviously mechanical problem can be blamed on messing with the software.


But the first thing they'd do even if you didn't jailbreak would be to restore your device and pull a backup from iCloud. If you want to keep your jailbreak, then they can't help you, because who knows if the latest version of iOS included a patch for your problem, a tweak you installed borked the phone, or if Apple borked it?


And as far as I’ve seen when it comes to jailbreaks, anything but the most obviously mechanical problem is caused by the jailbreak.


I would read your articles. Do you blog somewhere? Link, please.


I disagree, in that I do see a future for macOS. Steve Jobs said it best: sometimes you drive a convertible, sometimes a pickup truck.

macOS may be the pickup truck, but it isn't going to go away without a fight. Not unless you can make software engineering work on an iPad the same way it does on macOS today.


A shift from Lightning to USB-C is Apple losing, because they didn't donate their port to the spec. They could have "invented" USB-C. I understand why they didn't, but I still prefer Lightning to USB-C.

They are doing this before the EU makes them do it. What Apple did do was force the market to adapt to a plug that can go in both directions. And they thinned out the plug quite nicely; it never needed to be so thick.


Lightning is awful. It looks nice, but that’s because all the fiddly bits that are prone to failure are hidden in the port. Ever had your iPhone’s Lightning port fail? It’s not particularly rare, and Apple can’t fix it without replacing the whole phone.

USB-C puts the parts that wear out quickly in the cable where they belong.


> Ever had your iPhone’s Lightning port fail? It’s not particularly rare

Really? In another life I worked at an apple store genius bar, and can't recall ever seeing a failed lightning port (physical damage aside). You know what's extremely common though? Schmutz. The port picks up pocket lint, and then the connector packs it all down to the bottom of the port. Eventually your phone stops charging. Often the gunk is packed so tightly it's not obvious even looking with a flashlight that there's anything in there, or that the connector isn't sitting properly. Just dig it out with a pin and you're fine.

Obviously I don't know what happened with your phone. All I can say is that in my experience (which is sadly extensive), if you had your phone replaced for a failed Lightning port, the real problem was almost always schmutz + incompetent tech support.

Edit: Didn't notice jen729w's comment when I posted. So... seconded.


> Often the gunk is packed so tightly it's not obvious even looking with a flashlight that there's anything in there, or that the connector isn't sitting properly. Just dig it out with a pin and you're fine.

That's a great tip - it never occurred to me and I've just been living with an unreliable connection for months.


My concern with USB-C is lint. My iPhone’s Lightning port regularly collects pocket lint, which I fish out with a toothpick. That’s easy to do.

(I notice the lint in there because the Lightning cable doesn’t quite snap in all the way; each time I’ve pushed it in, I’m compressing the lint up at the back of the port, and eventually it gets too much and prevents a solid connection. Only happens about once a year, but it happens.)

Looking at a USB-C port, there’s still space for lint but much less space to get in and remove it. Can anyone with a USB-C phone share their experience?


> Looking at a USB-C port, there’s still space for lint but much less space to get in and remove it. Can anyone with a USB-C phone share their experience?

You anticipated my experience exactly. My iPhone 6 got lint impacted as much as my Pixel, but it was a lot easier to fix the iPhone. Lightning does seem to require a more solid connection. USB-C will work longer with a "partial connection", which really just means I'm going to wait longer to clean it out.

Edit: I've never gotten any lint inside any of my USB-C cables though. Looking at the cable in front of me, it would be a bitch to clean out.


I have a usb c jack on my Android and haven't had any lint woes.


I didn't know that, so I'm glad you mentioned it. I have had at least 20 failed cables and 0 failed ports (of maybe five or six devices) in my experience with Lightning.

Actually, I'm not glad you mentioned it since that makes it even more perplexing. I was glad because if a port ever failed I would not have easily diagnosed it because I thought they were pretty good. But it must be my fault the cables are failing....


My phones seem to eat cables, at the rate of about one a year, per device, but the ports last the lifetime of the device - which is a huge improvement over microusb.


I’ve had two ports fail.

Another issue with lightning: the contacts arc. The middle pin on any non-brand-new cable is almost always a bit burnt — it’s pretty easy to see. I don’t know whether USB-C has mitigations for this.


Buy cables with Kevlar in them. I no longer go through a cable a year.


Huh? USB-C has that flimsy little tab on the port whereas Lightning is... just a hole.

I seriously don't understand the logic here. I've never had a port fail on me.


They may not have invented it, but 18 of the engineers on the certification work for Apple and they were the first laptop manufacturer to announce support. They’ve pushed heavily and quickly into USB-C so I really don’t see any signs of coercion.


Why do you prefer Lightning to USB-C? Just curious.


I prefer it because it feels way better. The snap, and the way the plug feels secure in the socket.... I had a nexus 6p for a year and the plug was so damn unreliable and wiggly. Bought multiple different types but they were all bad. That said, I haven't used any Apple devices with USBC and I would not be surprised if they felt much better.


I have an iPhone and a MacBook with USB-C ports, and my experience has been the exact opposite. The Lightning port on my iPhone is really weak; any cord can lose its connection way too easily, so my music stops or it stops charging. The USB-C on my laptop snaps in really nicely, and I'm never worried about the cable not being fully connected.


Sounds like you’ve got some lint in your iPhone’s charging port. I recently used a cocktail stick to remove a big lump of it from my wife’s 6s where she had the same issues.

Cable snaps in nicely now.


Yeah that's what I was originally hoping, but I've tried cleaning it out a few times with no luck.


Two of the USB-C ports on my 2016 MBP have been loose for a while.


I understand what you mean about the feel, but Lightning ports have a fundamental physical flaw that is part of their design. The pins are on the iOS device side, so if they wear down or break, that device needs to be repaired. With USB-C, the pins are on the cable side, so any damage to them is just a simple cable replacement.


Is this an actual problem? In all my experience I've seen cables fall apart (mainly USB Micro or HDMI) but beyond lint, never had any issues relating to the actual ports.

The lightning contacts are flush with the port, so very unlikely to get damaged. On the other hand the USB-C has that fragile looking wafer on the device-side.


I agree. I will often knock out the USB-C cable from my MacBook when it's on my lap causing it to stop charging, whereas I can hang my phone from a lightning cable.


For USB-C, this depends hugely.

I've had phones where the plug felt very weak in the socket, and devices with a very satisfying snap and a very strong connection.

This is to say that USB-C is not inherently bad; it depends a lot on the hardware (and I'm sure Apple will get it right).


They didn't on the MBP - some of my dongles and cables even with high end equipment lose connections easily.


I've had a 6p for ~2.5 years now and only occasionally have troubles, and then only with certain plugs.


The snap. And things can't get IN the cable port.


Are you sure Apple didn't invent USB-C? There are sources suggesting they did, like https://9to5mac.com/2015/03/14/apple-invent-usb-type-c/


The simpler explanation is that Apple simply wants to move to USB-C because they see benefit in being on the "standard" mobile device port now that it does not have marked drawbacks vs. their proprietary solution and can be used for their entire line of mobile devices, including laptops.


It's kind of amazing how much the lack of a real Chrome (not just a wrapper around WKWebView) prevents me from caring about this.

I suspect the drawing experience on here is better than with a ChromeOS device, but I care more about my web browsing/creation experience than I do about sketching.


What is it about Chrome's renderer that specifically appeals to you?

I agree it's somewhat ridiculous that you can't get (real) third party web browsers, but Apple's is so damn good that I don't consider it a real problem.


....


Do you mean the extension cable that came with the original Apple USB keyboard?

It was only for the extension cable -- which are prohibited by the USB spec, so it was technically a "captive" extension. (The USB specs define captive as having a non-standard-USB plug, even if it's not physically captive.)

Blame the USB spec, not Apple. They're actually the only company I've ever seen make an extension cable that meets the letter of the spec.


>Apparently you're not old enough to remember those apple "USB" keyboard plugs with the little v-shaped indentations so you couldn't plug in non-apple approved devices.

Wasn't that because of the extender cable? As I recall extenders couldn't be USB compliant, so apple made a notch in the extender so that it wasn't technically a separate USB accessory from the keyboard?


The keyboards themselves didn't have the V, it was the extension cord, and it was because it was out of USB spec. If you push hard enough, any old USB port went in just fine.


Seems like it'll be really difficult for them to ever make the iPad suitable for software development while also preventing users from installing software except through the app store. But if they ever figure that out I'll probably replace my laptop with an iPad.


I wish they'd make the iPad more development-friendly. I don't know exactly what that looks like… but if they did that, I'd use it as a daily driver in a heartbeat.

So so so done with Apple laptops, though. It seems like the butterfly keyboards are here to stay, and they're just plain awful.


"The new Liquid Retina display goes from edge to edge"

I've never owned an iPad, perhaps I'm missing something here. How does the above statement reconcile with the pictures I see on the page where there's clearly a black bezel?

Do they just mean there's not an area at the bottom with a physical button?


Am I the only one who actually likes to have an edge on my devices so they don't break just from looking at them? At least if there is a metal edge it can take some of the force and you have a chance of not breaking the screen. I don't understand why metal is bad and glass is somehow good? Also, I need to grab my iPad, move it around, spin it, grab it with the other hand, etc. I see the edges/bezels as very useful in these cases.


You're not the only one. I also like a bit of a bezel or "chin" on my phones, because it makes holding the device with one hand for e.g. taking photos much more secure.


My 75-year-old mom has real trouble trying to use her iPad mini, especially with taking pictures. Inevitably the left hand trying to hold the iPad has a thumb on the touch screen. Then when she tries to do anything else with her right hand, it is registered as a pinch.

She also had a career in graphic design and now teaches oil/acrylic painting. I know from her, the art world likes frames. A bezel is not a bad thing.


Not to take away from the larger concern, but in your mother's case, perhaps a case with a significant border would help.

I suppose if there is not a ready-made one that suits, making /hacking one, the old-fashioned way or through 3D printing, would be an option.

One reason I put a case on every one of my cell phones, is to give me something more substantial to hold on to. The thin bodies and edges start to become a detriment; all the more so in that I have no interest in squeezing mine into the pocket of skinny jeans.


Thank you for your suggestion. My mother has declined adding a case because she carries the ipad mini in her purse when she is out. She doesn't want the additional bulk. She knows she is "not holding it properly" and moves her thumb if it's not responding to her.


I've seen people using finger loops that are stuck onto the back, or straps. I managed to find a decent looking instance of the latter in a quick search (so, just the first I found):

https://www.amazon.com/TFY-Security-Holder-Finger-Tablet/dp/...

Cheers


I think iPhones have for a long time been designed with the assumption that every user will add a case.


That's funny... I specifically don't use a case with my iPhone because I feel they have been designed as they are to work without a case.


Iirc, many years ago Apple noticed the majority of customers liked cases (due to things like personalization as well as safety), and started designing iPhones with cases in mind. They're still meant to be pretty when naked, but I believe majority of people are using cases. This is why things like the camera bump have made an appearance but are really not an issue, since cases flatten those lines out.


I wish I could use it without a case but the thing is so damn slippery without one. Slides out of my pocket or around when sitting on the couch and it's on a slight incline.


I feel like if Apple truly didn’t believe iPhones needed a case, they wouldn’t make and sell them.


So if everybody is going to use a case anyway, why isn't the functionality of the case just built into the phone? This applies not just to apple by the way. I've never understood this.


Personalization. People love to express themselves with different cases, and having the device even several mm thicker would make most customers look at it in disgust, because then with a personalized case it would be huge. And trust me, outside of the tech community, no one wants thick phones.


I'd much rather an external 3rd party case. When my $20 case gets worn or damaged I can spend $20 on a new one.

If that was built into the phone it would be much more expensive to replace, and I'd end up just putting another, slightly bigger $20 case around it to prevent that from happening.


If Apple believed the iphone required a separate case, they would build the iphone like a raspberry pi so users can build their own lego case :). Or build the iphone with plastic because it's cheap and light instead of metal and glass.


iOS is generally supposed to ignore touches on the edge of the screen, but it doesn't always work. I guess this is one case where it isn't really functioning as expected.


Honestly I think that can be fixed quite easily with a phone case. I actually prefer a case on my XS because it's so damned slippery if you lay it on anything with a slight slope it's falling off, and it's easier to hold.


Eww, you're touching your devices. You're only allowed to swipe.


Right?! I do not understand the crusade to remove all bezels.


It's a key part of the bigger plan to remove thickness, bezels, ports and functionality.


And battery life.


"Edge to edge glass" doesn't mean there's literally no physical edge banding around the screen. The iPhone X models with their "all-screen design" still have a stainless steel edge band. The iPad pro clearly still has a metal edge.

The device you seem to be imagining where the glass runs all the way to the edge with no edge banding doesn't exist, and would be an ugly and unpleasant device if it did, as the edges would reveal the electronic innards (and heaps of glue).


Don’t some high end Android or Samsung models have edge to edge phone displays?


Sure. They still have an edge band (or a back that comes around to act as an edge band). There are no phones with screens that simply stop in a raw cut edge.


Whether you like the edge or not, that's not the question your parent asked.


Very much agreed. Plus, it looks better. Compare e.g. my phone [1] with its symmetrical shape, with "chin" abominations like half of new phones being released.

[1]: https://www.gsmarena.com/lenovo_zuk_z2-pictures-8125.php


I agree. I can't say removing the bezel is something I wanted or asked for.


Why not just put a case on it?


Billions of dollars in research and industrial design, thousands of man-hours sourcing the perfect materials, hundreds of thousands of hours of engineering that give a device a most perfect feel.

Only to be placed in a $10 case with bunny ears so it won't crack if you accidentally drop it.


Exactly!

Slim, sleek, shiny phones which break when dropped. Easily dropped because of the shiny, slippery surface.

What to do? Put some cheap something around it.....

And I'm wondering why people ask me why I don't have a case around my phone.

I don't buy a thin phone just to make room for the plastic protection, that's why...


Frankly I think it makes perfect sense that a phone come without protection built in so that I can swap out the protection myself and, after a couple years, I'm not stuck with a banged up device.

This doesn't seem weird or contradictory to me at all. Nothing seems cheaper to me than my girlfriend's Android phone with rubber bumpers built in. Meanwhile my iPhones with cases look brand new when I'm ready to sell them.


I agree with you. I was just saying that if you want a rugged phone, then you can accomplish that with a case.


I'd prefer that the ruggedizing were built into the phone itself; they could probably space components out a bit better for heat dissipation and maybe add a second port.

It is sort of hilarious that miniaturization has gotten so far that we purposefully de-miniaturize miniaturized goods.


$10 case? Brother if I drop $1,899 on a tablet it's going in a $130 Otterbox Defender and I'll consider it money well spent.


I hate the bezelless trend. And quite frankly I hate the trend of making devices thin. Now I just need to add depth so I can hold it and use it comfortably without worries about dropping the damn thing.


Can't you solve both problems by just using a case?


Without a bezel there is nothing for the case to wrap around.


Yeah, I have to spend extra money on a case, and I do. Instead of proper bezels and extra battery life, I get a device that sucks at palm detection. But it's super thin.


Think of the extra battery that could be added in...


Indeed. People do just love hearing words...

Presenter: “This display goes from its start...”

(Audience gasps)

Presenter: “...all the way until it ends! We call this ‘entire’.”

Audience: <mad applause>

Audience Member 1: (whispering) “I can’t wait until I can get a device with an entire display. I’ll have to put in some OT at the job to save up, though.”

Audience Member 2: “What do you do?”

AM1: “I work at a company that specializes in digging half-holes.”


This is my favorite response so far. I don't really have much interest in the bezel or not debate. I just didn't get "edge to edge screen" as a good description of what was being sold. By that definition, all displays are edge to edge unless you deliberately obscure some portion of the screen.


"I have wit. I have charm. I have brains. I have legs that go all the way down to the floor, my friend."


Well, I guess you could say the display is edge-to-edge because it goes from the edge of one bezel to the edge of the opposite bezel... :)


Yeah pretty much. The bezels are smaller than on current ipads, but yeah "edge to edge" is definitely a stretch.


You mean a "lie." Calling a notched iPhone "all-screen" is a bit of a stretch. This is just straight upfalse.


Forgive the humor in a HN comment, but because of you I shall add "upfalse" to my working lexicon.


It's a plusgood word.


There should be room for "infinity screen" down the line too


"edge to edge" is definitely a stretch.

Ha!


I guess Android phones were "edge to edge" when they first got onscreen buttons :)


But they were not "magical".


This is because the glass is edge to edge, even if the pixels aren't. (so 'technically' correct, but certainly misleading)


Pretty much all iPads ever had had this feature, but I find this advancement super innovative!


I own many Apple products, and I like Apple as a company a lot; but I would have to agree with your observation. The reality does not agree with their marketing. I understand marketing can be employed as a tool to favorably present reality, but still, the edge-to-edge statement is quite a stretch.


Bezels are smaller, and consistent all the way around, with no ‘chin’ or ‘forehead’ bars.


But they aren't even small bezels. There are still rather large bezels all around the "edge to edge" screen.

This is just a blatant lie by Apple's marketing.


Their marketing begins and ends with large, high quality photographs of the product which clearly shows the bezel size. Nobody is going to be misled.

Reserve your criticism for companies like LG who actually lied about their monitors:

https://www.flatpanelshd.com/news.php?subaction=showfull&id=...


There's no limit on criticism. Just because LG was worse in a particular instance doesn't mean Apple gets a free pass.


My point is nobody is being deceived and Apple clearly isn't trying to be deceptive. "Edge to edge" as a term has been used to describe many things, and never the mythical 0.00mm bezel that only a pedant would assume.

But sure, there's no limit to criticism.

Everything is horrible.

Buy nothing.


How is calling a 9mm bezel "edge to edge" not being deceptive? It's not even a small bezel by monitor standards, much less mobile ones. This isn't at all about a mythical 0.00mm bezel (which isn't actually mythical - those devices do exist); it's about not being in the same ballpark as what is commonly referred to as "edge to edge" or "all screen", which is in the ~4mm-or-less range.

How can you possibly look at what Apple is claiming and then look at the pictures and say the words aren't deceptive?


It's not deceptive because they show you the picture first. They have the most beautiful high resolution product photography that leaves any potential buyer in no doubt about the bezel proportions. There is absolutely no opportunity for deception to occur, unless the buyer is a complete fucking idiot who thinks the real thing is going to be better than the photographs because they also read a few words of marketing hyperbole.


Yes, in this case it seems to be a euphemism for “we junked the Back button”.


I think this essentially means the bezels are the same width the entire way round.


It isn't liquid either. False advertising!


It's like saying 'lightning fast', when in reality it could be faster or slower than lightning. It's a metaphor that does well in marketing and doesn't leave you open to legal challenges as much as a more technical reference (e.g. 'zero bezels') would, but gets the point across that the bezels are thinner than usual.


> it's a metaphor

No, "from edge to edge" and "all-screen design" are not metaphors but as concrete as it gets, and obviously wrong. This iPad may be great but these marketing texts are just straight lies.


You don't seem to understand what a metaphor means.

If I say it's fast as lightning, I'm not saying it's exactly that fast. It's a metaphor for fast.

The fact the speed of lightning is measurable and 'as concrete as it gets' doesn't change the fact that people use it as a metaphor.

You may feel the use of 'edge to edge' is an inappropriate metaphor that's fine, not every metaphor is fitting and, hell I'd even agree with you if you said that.

If you ask me, you can start saying edge-to-edge when the edge is at least 50-80% thinner than what it is now. But it's a metaphor nonetheless, and I'd feel fine applying it to extremely thin bezels which are not actually edge to edge. If you'd have said such a thin bezel was 'razor-thin' or 'paper-thin' when it really wasn't, I'd say it's still a perfectly fine metaphor.


Agreed that 'fast as lightning' rarely refers to the literal speed of electricity in atmosphere; it's commonly understood to mean 'very fast'.

However, there is no such common understanding of the phrase 'edge to edge' meaning almost edge to edge. Edge to edge in English means it goes from the edge of one side to the edge of the other, literally.

It's not a metaphor. Only Jobs had a powerful enough reality warp to make that fly…


You're being silly; it is a metaphor, and it's been used before Apple ran with it. The Dell XPS was often said to have an edge to edge display. [0] It didn't, literally; the bezels are similar to this iPad's.

Because in the context of displays there isn't really any big consumer product out there that actually has no bezel at all. Rather you have all these nearly edge to edge displays. And we talk about them as being edge to edge, razor thin bezels etc. There's no consumer out there who thinks he's getting a display without bezels. Everyone gets the metaphor.

> It's not a metaphor.

If you don't think it's a metaphor you'd have to believe that Apple truly thinks this device has no bezels, that consumers typically think it has no bezels, that journalists using these descriptions think they have no bezels, because they all take 'edge to edge' as being literal, rather than metaphorical. And that's simply not true. It is a metaphor, whether you (or I) think it's an appropriate one or not.

[0] https://www.theverge.com/2015/10/8/9476199/dell-xps-15-2015-...


Dell doesn't call it edge-to-edge or all-screen in their marketing.

Anyway looking at the dimensions of the iPad Pro 12.9" model it's 11.04 inches wide with a 12.9" 4:3 screen. The screen itself is going to be 10.32" wide as a result, putting the bezel at .36" or around 9mm. By comparison the XPS 13's bezel is 5.2mm.
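The geometry is easy to check; a quick sketch of the arithmetic (assuming a 4:3 panel, and taking the 11.04" device width from Apple's spec sheet):

```python
# Estimate one side's bezel width on the 12.9" iPad Pro.
diagonal = 12.9                 # screen diagonal, inches
w_ratio, h_ratio = 4, 3         # assumed 4:3 aspect ratio
hyp = (w_ratio**2 + h_ratio**2) ** 0.5          # 3-4-5 triangle, so 5
screen_width = diagonal * w_ratio / hyp         # 10.32 inches
device_width = 11.04            # inches, per Apple's published dimensions
bezel_in = (device_width - screen_width) / 2    # one side, in inches
bezel_mm = bezel_in * 25.4
print(f'screen {screen_width:.2f}" wide, bezel ~{bezel_mm:.1f}mm per side')
```

Which lands at roughly 9mm a side, consistent with the figure above.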


Dell does call it edge-to-edge in their marketing - https://www.dell.com/en-my/work/shop/cty/pdp/spd/xps-13-9365... - "Real wonder. The revolutionary InfinityEdge display is now available for the first time on a 2-in-1, providing a virtually borderless edge-to-edge view, all in the smallest 13-inch 2-in-1 on the planet*." Note that they talk about the borders as being outside the edge-to-edge part.

Lenovo also use it on this desktop with bezels the size of the moon[1]: https://www.lenovo.com/us/en/desktops/lenovo/b-series/b50-30... saying it has a 'vivid 23.8" edge-to-edge display'

Microsoft use it to describe the Surface Laptop here - https://www.microsoft.com/en-us/p/surface-laptop-1st-gen/90f... - "Enjoy more space for your ideas with an edge-to-edge display and ultra-thin bezel." - specifically calling out the edge-to-edge display as being a separate thing from the bezels as well.

HP describe this desktop iMac ripoff with huge bezels as "edge-to-edge" here: http://www8.hp.com/h20195/V2/GetPDF.aspx/c06002849 - "An entertainment sensation; Sit back and enjoy a captivating entertainment experience. Elevate every stream, video chat, and photo with an edge-to-edge up to QHD display"

Acer uses it to describe a laptop in 2012: https://www.acer.com/ac/en/US/press/2012/50215 - "Featuring a 10-point touch edge-to-edge display and a larger trackpad, the Aspire V5-471P and V5-571P are designed to enhance multi-gesture content" - and it's a 2012 laptop, it has bezels.

The claim that only Apple use it and it's a lie and nobody else uses it in their marketing is nonsense.

[1] I mean literally the size of the moon, because nobody is allowed to use words differently without your approval, of course.


LG were perhaps the worst offender—they didn't just use jargon, they faked the measurements and doctored the press photos.

https://www.flatpanelshd.com/news.php?subaction=showfull&id=...


They call it an InfinityEdge display...


Ugh, I hate the way Dell lies! That's obviously not a metaphor.

Also, whenever journalists and commentators write about Dell screens as being edge to edge, years before Apple, they're not lies, they're metaphors.

It only becomes a literal statement that is obviously a lie, when Apple uses it.

/s

To everyone downvoting me above, you can disagree with the idea that 'edge to edge' is a proper metaphor to use for an iPad display with a 9mm bezel. I agree with you in full. You'd want something like 2-3mm bezels for that, at most.

But to say it's not a metaphor is silly. It would mean that you think Apple, journalists and consumers, all or some of them, consider the edge-to-edge marketing statement to be a literal one, that must be taken literally, and is obviously a lie, rather than a statement which must be taken metaphorically.

If you go to the CEO of Apple right now and ask him in an interview, do you mean literally edge to edge, or metaphorically, what do you think the answer would be? (apart from dodging the question)

After that you could tell him it's a crappy metaphor to use. But telling him he's lying because Apple means it literally is just silly.


No matter what it's not a metaphor. You might want to look up what that word actually means.


I Google'd "define:metaphor" and it says "a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable."

and... Grammarly says "A metaphor is a figure of speech that describes an object or action in a way that isn’t literally true, but helps explain an idea or make a comparison."

Soooo this supports the idea that "edge-to-edge" can be descriptive but not literally true, and be described as a metaphor, right?


The words 'edge to edge' do literally apply to the object in question, though. You can literally have an edge-to-edge display - see the Galaxy S9+ for example.

You can't just say "well even though the words could literally apply to the object and they do literally apply to the object in some cases in this case it's 'just a metaphor'"

It's not. It'd be embellishment if you want to be generous. But not a metaphor.


No.

The word "edge" in this context, literally means the edge of the device or screen. That's everyone's understanding. There is nothing metaphorical about it.

The word "lightning" for example, does NOT literally mean "fast". It literally means the electrical energy we see in the sky. When used to describe computer speed, it's obviously a metaphor, and well-understood as a metaphor.

You can't just claim something is a "metaphor" to excuse false marketing.


Yes you can. "a word or phrase applied to an object to which it is not literally applicable." 'Edge' applied to an object which is not literally the edge. Metaphor.

You can't just claim the definition is not the definition and expect people to side with you over multiple dictionaries.

> That's everyone's understanding. There is nothing metaphorical about it.

Literally everyone? Or figuratively everyone? You think the Apple marketing people cannot see the bezel and honestly think is not there?


"Edge" refers to the edge of the device. There is no other edge metaphorical or otherwise, imho.

More than definition, how metaphors are used is important.

Is "edge" a metaphor when the actual edge is right there, 9mm from the screen perimeter? There is no metaphor, only 'edge-to-edge' window dressing. In the same way, "lightning" is not a metaphor in a discussion about thunderstorms.


As others have advised, you should look up how metaphors are used.

See my other comment in this thread, there is at least six years of prior usage by multiple big tech companies describing their product's screens as "edge-to-edge" without literally meaning edge-of-device to edge-of-device, with links.

And see my other other comment where I wonder why "edge-to-edge" gets your goat, when Apple describing the iPad Pro as "all screen" doesn't. Is that intended to be taken literally as well? There's no CPU, no memory, no battery, no glass, no other components, all screen?

Because the device is right there, it clearly has parts which are not screen, why aren't you frothing at the mouth about how it's a lie intended to "attract suckers" instead of a non-literal highlighting that the screen is large?

Like "all butter cookies" - that's a lie for suckers to think they are made of butter and no other ingredients, right? Because there's no way it could be read except literally, is there?

Non-literal descriptions are everywhere.


> there is at least six years of prior usage by multiple big tech companies

Irrelevant. This discussion was about the one and only use in town right now of "edge-to-edge" in a major campaign to sell new tablets.

"All screen" is more ambiguous. For starters "all screen" is not a pre-existing term. It could be used to mean there's no other elements or buttons on the front, except the screen. "All screen".

"Edge-to-edge", on the other hand, has explicit meaning built in, the primary component of which is measurement. "Edge-to-edge" refers to not one, but two hard edges, and describes that which spans in full from one edge to the other. There's no vague interpretation possible unless you force a square peg through the round hole of English. It's either an edge-to-edge screen, or it's not.

I had to measure a washing machine recently to find out if it would fit. I measured edge-to-edge, and by that I mean the actual left edge to the actual right edge. But you knew that already without me explaining... because I said "edge to edge".


So every single phone on the market can be described as “edge to edge”?


My old 15" CRT monitor had awesome edge-to-edge technology. It was apparently way ahead of its time.


What is calling it "edge-to-edge" describing? Would it be equally valid to say that it has a 20" screen as a metaphor for how large the screen is?


¯\_(ツ)_/¯

Are you equally offended that this "floor to ceiling room divider" stops a couple of inches below the ceiling? https://www.amazon.com/Royhom-Privacy-Divider-Decoration-Apa...

Do you expect this "top to bottom house cleaning service" to include the chimney and roof and aerials because the "top" has a literal definition? https://top-to-bottom.cleaning/

Are you angry that "surround sound" only has a discrete number of speakers, usually 5 or 7, instead of a continuous surrounding panel?

Are you baffled by Thomson Video Networks' claim that they have an "all-encompassing video infrastructure" when you can see things in the world not encompassed by it? ( https://www.broadcastingcable.com/post-type-the-wire/thomson... )

Do you think "unmissable TV shows" are literally unmissable? ( https://www.makeuseof.com/tag/unmissable-tv-shows-watch-hulu... )

Why aren't you complaining about Apple calling the iPad Pro "magic"?

Why are you choosing "edge-to-edge" as the hill to die on, when Apple call the iPad Pro "all new" (it isn't), "all screen" (it's not), "all powerful" (it isn't), "a magical piece of glass" (nope) which "does everything you need" (even breathing?), "any way you hold it" (even covering the screen?), "true to life color" (even though you can't represent purples on an RGB screen?), "make everything look gorgeous" (even ugly people?), "the perfect machine for augmented reality" (even more than dedicated glasses? There can never be a better machine for AR?), "immersive games" when you can't immerse yourself literally in them?


I think the floor to ceiling divider is meant to literally reach the ceiling. It comes in different sizes. I would be upset if I bought one that didn't quite reach the ceiling because it was a few inches shorter than advertised. I would definitely be upset if I hired someone to make a floor-to-ceiling partition and it didn't reach the ceiling, unless floor-to-ceiling had a clear well-understood different meaning in that domain.

The surround sound does literally surround you. Surrounding a person doesn't require a continuous circle.

They said it's an "all-encompassing video infrastructure for broadcast and multi-screen services." That has a pretty clear meaning that it covers everything you need regarding infrastructure in those domains. It is not a metaphor.

"Unmissable" has a second meaning, "too good or important to be missed." [1] It is a subjective statement that the episode is good.

Calling the iPad Pro "magic" is the only thing here that is a metaphor.

I'm not picking this hill to die on. I asked one question. You say it's a metaphor, but neither of us have any idea what it's a metaphor for. I think what you're actually trying to argue is that it's completely meaningless marketing fluff that can be applied to any phone or tablet. Would you object to calling a feature phone edge-to-edge?

I also disagree with "all screen", which should have the same meaning as edge-to-edge. All your other Apple marketing term examples have clear meanings (except "all powerful", did they really say that?)

A metaphor is when you say one thing is something else to draw an analogy between them. It doesn't work when the something else is the same type of thing. It would not make sense to say "my car is a Lexus" as a metaphor to mean my Hyundai is nice. It wouldn't make sense to say "my car is turbocharged" to mean it's fast when it doesn't have a turbocharger. If edge-to-edge is a metaphor, I can only think it's a metaphor for a tablet with a screen that doesn't have a bezel. It seems like a straight-up lie.

My biggest issue with the edge-to-edge marketing claim here is that I have no idea what they mean by it. It was bad enough when they made that claim for the iPhone X, but the iPad Pro has a huge bezel. They might as well claim it fits in the palm of your hand.

[1] https://www.merriam-webster.com/dictionary/unmissable


You say it's a metaphor, but neither of us have any idea what it's a metaphor for. [..] I think what you're actually trying to argue is that it's completely meaningless marketing fluff that can be applied to any phone or tablet.

I probably am, yes. I mostly think it's silly to argue that it "can only be used literally" in the face of it being used non-literally. It's a metaphor for its literal definition: this screen is "edge-to-edge" like a screen with no borders. It metaphorically has no borders because it's so big and the borders are so small; even though it literally does have borders, you won't notice them, and you'll accept the screen is edge to edge because it is very like one that is. Yes, it does seem like a literally false statement to sell people on an idea.

Would you object to calling a feature phone edge-to-edge

I don't know. On the one hand I don't think it is, any more than I think the iPad Pro is. On the other hand, they can call it whatever they want; they will just say "the screen goes from the edge of the screen to the edge of the screen" and everyone will say "duh". Do I think it's harmful? Not so much, because the screen is visible. "All day battery life" was more misleading because you can't see the contrary at a glance. "No fee" when there is a fee, way more harmful.

"Unmissable" has a second meaning, "too good or important to be missed." [1]

Well "Edge" has meanings "2.a. the line where an object or area begins or ends : BORDER" and "2.b. the narrow part adjacent to a border" and "2.c. a point near the beginning or the end" - https://www.merriam-webster.com/dictionary/edge

So "edge to edge" could meaninglessly but accurately be saying "from a point near the edge to a part adjacent to the other edge". :-|

(except "all powerful", did they really say that?)

Yep; top of the page here https://www.apple.com/ipad/ - All New. All Screen. All Powerful.

Meaning, presumably, "every bit of it is new, it's mostly screen, every component is high end".

It would not make sense to say "my car is a Lexus" as a metaphor to mean my Hyundai is nice.

"How is your new Hyundai?" "It's so luxurious and feature filled, it's totally .. i dunno, it's ... very Lexus! Yeah, my car is a Lexus by another badge!". You wouldn't understand that?

It wouldn't make sense to say "my car is turbocharged" to mean it's fast when it doesn't have a turbocharger.

"How is your new Tesla?" "FAST it's so fast it's turbocharged supercharged bullet train rocket engine awesome" "I don't understand, it cannot be turbocharged because there is no internal combustion engine exhaust gas to spin the turbine, why are you lying?"


> you'll accept the screen is edge to edge because it very like one that is

I think that's the core of the dispute here. The screen on this iPad is nothing like a screen that's edge-to-edge. It's surprising and confusing to see it described like that with such a clear thick bezel in all the images. I and a lot of other people in the comments here don't accept that this screen is edge-to-edge.


"a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable."


But you cannot just take any words and say it is a metaphor. I cannot say "this is a walnut desk" when it isn't, and just claim that it's a figure of speech standing in for a very solid and beautiful desk. A metaphor must be widely understood as such before you can use it.


Just to be clear, "InfinityEdge" is a brand name, like "Lightning connector"; it is not a description of the display itself.


Exactly, which is clearly a made-up marketing term with no established meaning, just like Retina Display.

all-screen and edge-to-edge, however, are not.


"It's as fast as lightning" is a simile.


Yeah, no ugly "chin".

Problem is, the iPhone X family got rid of the chin... but have the dreaded notch, and not even the mostly acceptable teardrop everyone else standardized on.

At least the iPad Pro adopted chinless and also notchless design. First time in years Apple has made a product that isn't weirdly ugly.


>Problem is, the iPhone X family got rid of the chin... but have the dreaded notch, and not even the mostly acceptable teardrop everyone else standardized on.

You mean "copied with a tiny alteration".

>At least the iPad Pro adopted chinless and also notchless design. First time in years Apple has made a product that isn't weirdly ugly.

You probably didn't get the memo, but the "oh my god, it has a notch, oh noes, so ugly" thing some pundits tried to pull died at birth; not only did competitors copy the notch, but the X itself became the best selling phone.

Even ignoring that, everybody pretty much praised the Apple Watch 4 as beautiful as well...


> the "oh my god, it has a notch, oh noes, so ugly" thing some pundits tried to pull died in its birth

Did you know that Google had to ship an update for the Pixel 3 XL that "disables" the notch in software (by making the background of the part of the screen around it black, making it less noticeable), because of how many people complained about it?


there were phones that effectively had a "notch" long before the iphone x. for example, the lg v10 had a second screen directly on top of the main screen with the two front-facing cameras slid over to the left. the second screen was used to access recently opened apps, text messages, settings, flashlight, music control, etc.

in general, apple just tells people they invent and do everything the best, and everyone listens. it's funny, because my lg v35 is much thinner and lighter than my girlfriend's new iphone xs max (which has a marginally larger screen), doesn't have a protruding camera lens, and it actually has a headphone jack with a high-end dac (which basically no other phone has). so apple says they need to get rid of the headphone jack to save room, and yet here we are.

i specifically avoided the lg v40 because they added the notch. as with the new ipad, i simply don't understand this need to extend the screen all the way to the end of the phone in the long way. this is especially true on the ipad, which i nearly always hold horizontally with my thumb on the perfectly sized bezel. getting rid of the bezel literally adds no functionality for me and actually removes some.


"mostly acceptable teardrop everyone else standardized on."?


Yeah, that's pretty delusional. I don't think anyone has created a phone with a notch that would accurately be characterized as a teardrop shape. The closest was probably Essential (which inexplicably had a notch and a chin). The Pixel 3 notch is a V shape. The OnePlus notch pretty much looks like an iPhone notch. There's no "standard" and certainly no "standard teardrop shape".


I don't blame you. The OnePlus 6T was released yesterday with the teardrop notch.

It is a trend among Chinese phones (Huawei, plus the BBK Electronics brands Oppo, Vivo, and OnePlus).

OP 6T is the first phone popular in the west with this style, and I find it to be amazing.

Still like the MiMix 3 the most, with no notch. But, the OP 6T is the first one that's not egregious.


Looks more like a "condensation drip about to fall" than a tear drop...

Not very catchy though, perhaps "innovative CDATF notch - uses 66% less screen space" would sound hype enough for the Marketing department?


What if I have really weirdly shaped tears?


far from everyone, but prominent examples are the OnePlus 6T and the Huawei Mate-20:

https://i.ytimg.com/vi/r_LNpJOd0ao/maxresdefault.jpg


That's not a "teardrop". Are people actually calling it that?

Also, two phones does not make a standard. Even the Mate 20 Pro has an iPhone-style notch.


Both have the slanted sides that I think "passes for" teardrop shaped, although it's clearly more slanted on the OnePlus.

And indeed I was referring to the Mate 20 and Mate 20X rather than the Pro.


From the keynote, Cook claimed the new A12X Bionic[1] is faster than 92% of portable PCs[2] (in English, laptops, Surfaces) sold in the past year, including some i7 models.

- 8 core CPU (4 low and 4 high power cores)

- 7 core GPU

- HEVC encoding/decoding

- Neural Engine (they haven't made any comparisons to the A12 sans X neural engine, so at this point I think it's the same)

- Audio DSP, Storage controller, Image Signal Processor, Apple performance controller, Depth Engine, &c.

[1] http://live.arstechnica.com/apples-october-30-2018-more-in-t...

[2] http://live.arstechnica.com/apples-october-30-2018-more-in-t...


The hardware specs of the new iPad Pro read like the dream mobile machine. If there weren't so many software limitations in iOS preventing it from replacing a laptop for many tasks, it could take over the laptop space in no time.

- Files is way too limited. It should be possible to exchange the system-supported data types between all apps via files. You can't even add an audio track to Music or a video file to the TV app as it is right now.

- The split screen feature is a good one, but so far I have rarely found it useful. Not all apps support it, and many apps differ in their behavior. The 50:50 split is nice, but the other split ratios I found less useful. For some tasks you need overlapping windows.

- On the same note, one of my favorite features is picture-in-picture for playing video. Unfortunately, again, apps differ in behavior, and I don't understand the size limit; I would like to be able to resize the video to any size.

- And of course, coding and running plain Unix utilities. While I am happy about the security the app model with sandboxing brings, for a "Pro" machine this is too restrictive. Apple should allow something like Termux on the iPad; even if it were limited to a sandboxed file system, it would make the iPad so much more useful as a computer. Apple might even release a lightweight Linux VM as an app.

If Apple could remove these pure software-restrictions of the iPad, they could attract a lot of "Pro" users, I think. Disclaimer: I am an iPad Pro owner, fully in the Apple universe :). iCloud sync already makes it a much more useful device, but I keep hitting my head against the limitations. Running "Blink" to connect to my Linux server makes it almost a laptop, but is very limited.


Totally agreed on this.

I could do real work with a browser + terminal client, but have a few issues:

- None of the terminal clients really give you 100% of a normal keyboard (been a while, I forget what doesn't work right).

- Splitting the screen doesn't work well enough. I want 2 windows (browser and terminal) that are each 75% of the screen and the ability to switch between them, preferably from the keyboard. A 50/50 split doesn't work well, and switching the 70/30 split between windows makes a mess when it redraws my terminal on resize.


Yes, 2x75% windows had come to my mind as well. At least the new swiping gestures let you switch between two full screen apps quicker. For the terminal client, have you tried "Blink"? I found the keyboard working reasonably well; you can even remap caps lock to ESC. As it uses mosh to connect, it also deals extremely well with connection cuts (especially when the iPad hibernates the app in the background).


Sounds like you just want an A-Series MacBook (with decent keyboard?)


Overall the iPad has very attractive hardware as a laptop competitor. It has a touch screen and pen support, is fanless, and can be used without the keyboard attached. The more "Pro" it gets, the more the thought of replacing a laptop with an iPad comes up, but it ends at the software limitations.


Faster at -what-? People make tons of speed claims, but what is it faster at? I'll be interested to see real world benchmarks. I highly doubt the new iPad Pro can outperform the newly released Surface Pro with the base i7. I mean... he did say -in the past year-, and the new Surface has been available for... a few weeks. Also mind you that the new Surface line doesn't even use Intel's latest chips; the latest are 9th gen, but the new Surface is only packing the 8th. Honestly, when you think about it, their claims weren't overly impressive given they were comparing their hardware to 7th gen Intel chips.

That's not to say what they've done isn't impressive, or that they don't plan to kick Intel to the curb... but it is entirely possible this will be another PowerPC thing, where PowerPC outperforms for a while, Apple switches, and then Intel gets its act together and others are no longer able to compete. If Intel were to merely build a good integrated GPU that could compete with Nvidia's, there would be no chance Apple could compete.


That is the question, now isn’t it? I too look forward to the benchmarks.

Something to keep in mind too, iPad Pros have a much more constrained TDP than your average laptop, so it isn’t merely just a matter of performance but performance per watt and sustained performance.

There is also this question: given a higher TDP, what could Apple’s silicon team really do? Although I’m not convinced it makes financial sense for them to replace Intel with custom silicon, I’m happy to be proven wrong.


This is entirely anecdotal, but my iPad pro from last year handles photo edits (lightroom, affinity) much easier than my i5-powered desktop with a 1060 GPU. Anything from opening the file, doing edits and exporting the file in various formats is just a lot snappier on the iPad. I don't know about the surface pro


I'm not denying that the Pro is a beast of a machine, but I also think Lightroom on iPad is a slimmed-down version, no? One advantage the iPad really has is that the code companies like Adobe write doesn't have to be backward compatible with a range of GPUs and hardware. I'm sure there is a ton of bloat in Photoshop etc. that is just there because of older machines.

I am happy that the iPad Pro is pushing the limits, it is about time that the Surface Pro line had true competition, it will push both to be better.


It's a little slimmed down in the number of options on offer, it's essentially the "CC" version that's also available on desktop. But any limitations in features vs the full old-school desktop version is a result of design choices, not lack of power.


This was either a sharp prod for intel, or some serious signaling.


MacBooks _will_ transition to Apple's own chips at some point. My guess is that the A14 or A15 generation will be fast and powerful enough to make the leap.

This will be big especially if they can get major price or power usage advantages.


I'm worried they won't transition MacBooks to these CPUs, but they'll just let the entire Mac lineup gradually fade to black with intermittent refresh cycles and dead end features like touch bar.

I can well imagine Apple expecting people to plug in a keyboard and monitor to their iPad Pro, while it gets thinner and thinner.


While "intermittent refresh cycles" are certainly a thing for the Mac line recently, I think it's important to distinguish "new features that seem to be flops" from both "no new features" and "not caring about the Mac line." The Touch Bar may or may not be a dead end (I don't hate it the way many other people do, but if it went away I wouldn't shed a tear, either), but it represents a significant amount of engineering work put into a Mac-only feature. So does the love-it-or-hate-it† "Butterfly" keyboard. And the new ARM-based T2 chip, which handles tasks from encryption to signal processing to being the SSD controller, is also obviously significant engineering -- and again, Mac-only.

I don't doubt that Macs will eventually fade into the sunset, but I don't think it's coming nearly as quickly as either pundits or pessimists seem to believe. "Marzipan" may indeed be a first step, but I think it's going to be a Carbon vs. Cocoa situation, taking many years to fully transition. For iOS to supplant macOS, it has to essentially do everything macOS does. It doesn't have to do everything the same way, but if it's going to be a full replacement for a general purpose computing platform, it has to be, well, a general purpose computing platform.

†Technically, "grudgingly-tolerate-it" should be in there, but it doesn't flow well.


Just a note on the T2: it's a variant of the A10, more specifically, it's basically an A10 running a custom operating system (BridgeOS) and with the high performance cores disabled. There might be other modifications, but it isn't the kind of thing I would hold up as a triumphant piece of Mac-specific engineering. Not that it is a bad piece of silicon, just repurposed silicon isn't the sort of thing I would put stock in when it comes to the importance of the Mac to Apple.

The TouchBar was a good example though, even if it didn't receive the warmest reception on the market, it is still a good piece of Mac-specific engineering.


I wish they would take the touch bar idea, and apply it to the trackpad instead of the function keys…


Re-purposed engineering can still be triumphant.


> I can well imagine Apple expecting people to plug in a keyboard and monitor to their iPad Pro, while it gets thinner and thinner.

And at some point, Apple will introduce an "iPad mini" with no display


Arguably they are already powerful enough to make the leap in at least some cases, that said Apple does not push Macs in the same quantities as iOS devices as to justify wholly new silicon for the line. I have been in an argument with a friend for the past year or so about precisely when (or if) Apple will rip Intel out and use their own silicon. I think they will, but I'm going to try to present his argument to the best of my abilities, with my own observations and guesses as context.

Presently the T series is generally gimped variations of the A series. My understanding is that the T2 is essentially an A10 with the high performance cores disabled, and the low performance cores repurposed to manage the myriad of functions that were previously handled by dedicated controllers. They're also necessary for certain features like TouchID, and for Apple to use its own HEVC encoder/decoder.

Presumably the T3 will be keyed off of a later part like the A12 or a later generation, which will make it the first T series chip to also include the neural engine. Assuming they don't disable that part, it will potentially make the NE available to Mac application software.

So what's the real problem with ripping out Intel right now? Given a higher TDP, Apple could hypothetically design a much more performant chip that would replace both the CPU and GPU in their current Mac lineup, but they simply don't push enough Macs to justify the expenditure on custom silicon, right now at least. Probably doesn't help that they have been making existing customers less happy, but it also doesn't help that Intel has simply been re-releasing Skylake in different variations for a good few years now.

If you have a Macintosh able to run Mojave, an SSD, and at least 16GB of RAM, you are probably set. The important thing about Mojave isn't so much any one feature; it is the continued support, including bug fixes and security updates. Given all of that, most people simply won't need a new system any time soon. There's essentially no reason a laptop from 2012 can't take you all the way to 2022, all other things being equal. So we're starting to see sales stagnate across the entire PC market, including the Macintosh: while Apple is selling phones about as fast as it can stamp them out, the same is not true of MacBooks or iMacs.

Silicon is much more of a volume business. Apple can re-purpose older versions of their A series chips because those are an R&D and capital investment that has already paid off, it isn't a new design, and they just need someone to continue fabbing them. New generations of the A series chips will generally pay off, because they will be selling hundreds of millions of the same iPhone, and usually for greater than one calendar year. This past year, for the first time since the A7 was released, they skipped releasing an -X variant of their A series chips which are generally released to support new iPads (there was no A11X for those who may have forgotten). It is possible that iPads have not been selling in the quantities necessary for even a variant design, and rather than an annual refresh cycle, we could reasonably expect them to move to a biennial refresh cycle, hence why they have skipped the A11X and jumped straight to the A12X.

Intel and Qualcomm have customers, but for Apple's own silicon, Apple's only customer is itself, by design. That also imposes a limitation on themselves that it just might not make financial sense to fabricate a series of chips exclusive to their Mac lineup, appropriate to the TDP of a Mac.

By all means, tear this apart if you disagree.


Very well put. It seems to me that desktop performance has stagnated for the most part while mobile performance continues to make leaps and bounds, so people think they need to upgrade their phones and tablets much more often just like the early days of the PC. A 2008 MacBook is going to be a lot more useful than a 2008 iPhone in 2018.

Apple certainly markets their iPads as a competitor to the MacBook, so maybe they’re not looking to use custom silicon in the Macs but simply to phase out desktop computers when iOS and third party applications are ready to replace PCs for the most part.


> Given all of that, most people simply won't need a new system any time soon. There's essentially no reason a laptop from 2012 can't take you all the way to 2022 all other things being equal, so we're starting to see sales stagnate across the entire PC market including the Macintosh, so while Apple is selling phones about as fast as they can stamp them out, the same is not true of Macbooks or iMacs.

Isn't this becoming equally true for phones? Perhaps not for hardware as far back as 2012, but I'm expecting my 6S to last me a very, very long time. It's fast, has a great screen (and headphone jack), and I'll get the battery replaced as many times as I need to.

But this doesn't appear to have slowed iPhone XRS+ Maxx sales one bit.


I read a report recently that the average replacement cycle for iPhones is closer to 3 years now than it was this time last year. I would say this is true-ish of phones, but the reasons people upgrade are varied. I’m using a 6s+ right now and the reason I’ll be upgrading the next time I go to the Apple Store isn’t because this phone is in any way bad or worn out. What I’ll really be purchasing is a new and improved photography pipeline and bigger disk. Everything else is just a nice extra, but if I didn’t care about the camera I would probably continue to use this phone until it died of natural causes or starved to death from a lack of software updates.


totally agree. i still have a 2011 macbook pro upgraded with an ssd and a 2013 macbook pro. both almost indistinguishable performance wise from a brand new macbook I use at work. there's nothing it can do that these can't really, except touch ID, but then my apple watch unlocks my laptop now anyway.

i would add that the faster upgrade cycle, and therefore money worth spending on developing smartphones is also because people are used to carrier subsidy and spreading out the cost. so it's kind of a virtuous circle of investment that the pc market doesn't have.

makes you realise what a perfect storm iphone was - a massive and growing market of people paying through the nose, annually, for a crap product, just waiting for someone to take that money and invest it in something good.


I don't disagree, but I do want to point out that they did release an X variant of the A series with the new iPads.


Ah, I just re-read my post and I was unclear in that portion because I edited it too much.

I meant to say there was no A11X. I'll edit that in shortly.


Microsoft is preparing the move to ARM too. I'd be really interested in how the performance turns out. The Intel mobile processors (i7 and m7) felt really slow before the eighth generation and still used a lot of power (at least in the Lenovo and HP business notebooks).


Is Microsoft preparing an x86 emulation layer for their ARM-based OS? Because as the past has proven, nobody wants to buy a Windows PC that can't run Windows software. Unfortunately, it's almost all been compiled to run on x86. Or are they still betting on moving everything to UWP?


Microsoft is shipping an x86 emulation layer on the ARM based Windows laptops right now. (although with some limitations, e.g. as far as I know no 64-bit support yet, and of course it's slower than native on already not-so-fast laptops)

Benchmarks from the first device generation: https://www.techspot.com/review/1599-windows-on-arm-performa...



I think by the time MBPs are using A chips, iOS will have morphed into the new macOS — launching these together as a new MBP.


Or pure marketing. Look for the asterisk that explains the workload.


Yet 90% of the apps that people will use on it are optimized for a phone with lower specs. If I use a laptop with a GPU, I know I am going to use games or productivity apps (Photoshop, modeling software) that use the available computing power. With an iPad I am not so sure.


Check out the Photoshop on iPad Pro demo from the keynote, then. Apparently AutoCAD is also being ported to iOS. The difference in available computing power has diminished to the point that the real advantages Macs have nowadays are access to vaster stores of RAM, dev tools, and Darwin.


What's the difference? iPad apps use the full potential of the machine, like in any computer.

Yes, most iPads in existence are much slower than this iPad Pro, and yes, most people will be buying the inferior sub $400 one.

But also, most people have PCs with an Intel GPU, or a really low end discrete GPU... that doesn't stop game makers from making games that can't run on most machines, or Adobe from making Photoshop.

Let's see what happens with the iPad.


This is changing: Photoshop (coming), AutoCAD (coming), Civ 6, Procreate, the next MS Office mobile version, and Bias Amp 2 are all examples of apps targeting iPads. I think this will continue and even accelerate.


The more hardware support for HEVC, the faster the scene will move to x265, and the smaller files will be everywhere. I'm excited!


I wish they had a different keyboard cover though. I had a lot of problems with the iPad Pro 12.9 on my lap, because it isn't very stable.


There's a point in the press conference where the presenter went "And you can scroll the UI as fast as you want, and it doesn't lag! We have 8 cores in our CPU!" (or something; not a literal transcription) and the entire audience clapped.

That's about as serious an indictment of the competence of the software development industry as I can imagine.

Edit: to clarify: the line is "It's a real PSD... it's over 3 gigs, it's 157 layers, 12000x12000 pixels, and I'm zooming through it at lightning speed, with no lag". While she's doing this she's scrolling the layer list, which is why I was confused in my earlier statement.

Note, btw: when she's zooming, it's not loading the higher resolution in "real time". It takes a second or so for the higher resolution to pop in. So when she's saying "zooming through it at lightning speed", what she's saying is "the UI doesn't hang when I zoom, and the higher resolution pops in quick enough". This should not be an applause line for a device of this power in 2018.


Scrolling through layers in photoshop doesn't necessarily just scroll a list though, it's also actively selecting the content of those layers... which can be very complex (I didn't watch the demo so I'm not entirely sure of the specific scrolling context).

You're dealing with filters, transparencies, hand-drawn paths, masking, linked files, embedded files — there's a wide realm of potential data and complexity.

If you want to test the mettle of a computer, try opening a large Photoshop file and doing anything in it.

People who use Photoshop regularly totally understand the validity of that quick demo.


> If you want to test the mettle of a computer, try opening a large Photoshop file and doing anything in it.

This, right here, is why it's an indictment of the software industry. It SHOULDN'T be impressive that Photoshop doesn't lag. When she said that line, the entire audience should have shrugged and said to themselves "Of course you can zoom into an image in Photoshop. It's 2018. What's the big deal?"


I don't know enough about software to contest that statement, but I suspect you could be underestimating the complexity of what Photoshop does.

We're also talking about multi-gig files. Outside of media the occurrence of an individual application file this large is pretty small. It's not like you can reliably play a 4K video file on many entry-level machines either.

In a similar vein, I would argue that it's a better indictment of the industry that I sometimes have to close Chrome because it's hogging more resources than Photoshop.


I think the obvious point is that whilst what photoshop does is very complex and computationally intensive, it's not new. The vast majority of the functionality in an average photoshop file is the same as was in the product 15 years ago. We really have reached the point of diminishing returns. And doing those tasks 15 years ago wasn't slow, so it's questionable why photoshop hasn't scaled in speed with the hardware.


The demo file shown at the event was 12000x12000, 157 layers. It was 3GB on disk, 5GB fully loaded into memory. Double-tap to zoom was happening at 120Hz, fully animated, to anywhere within the document.

I realize I'm biased, but on such a machine, the observed performance is anything but rudimentary.


If you're just viewing it, the pre-raster is only ~500MB, which easily fits in memory and zoom and pan are free. Pretty much any 2GB+ device would do that. Now editing and changing visibility and filters and such are much more complicated, but a view demo is not really a workout for anything vaguely done correctly.
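
For what it's worth, the "~500MB" figure above is consistent with a flattened single raster of the demo document, assuming an 8-bit RGBA pixel format (the format is my assumption, not something stated in the demo):

```python
# Back-of-envelope size of a flattened raster of the keynote demo document.
width = height = 12_000   # pixels, per the demo
bytes_per_pixel = 4       # assuming 8-bit RGBA; 16-bit channels would double this
flat_bytes = width * height * bytes_per_pixel
print(f"{flat_bytes / 2**20:.0f} MiB")  # → 549 MiB, in the ballpark of "~500MB"
```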


It's not impressive if it's not doing it with ideal sharpness in real-time, which it wasn't.

Tiled zooming into arbitrarily large images is otherwise rather easy to do. Go plop the largest image you can find onto your nearest Android phone and it'll zoom & pan that image at 60fps with nary a stutter or issue. It will take a while for it to re-snap the sharpness as you zoom in, but so did the iPad Pro. How fast it was re-decoding at the new scale factor & crop is the interesting question, but one which wasn't covered.

Otherwise we're just talking the same basic tech behind gigapixel viewers. Which isn't interesting or significant in the slightest, and has been used on mobile for years and years and years.
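
The basic machinery behind those gigapixel viewers is a power-of-two tile pyramid: panning and zooming only ever touch the handful of tiles the viewport intersects, while sharper tiles are decoded asynchronously. A minimal sketch of the level/tile selection (tile size and function names are my own illustration, not any particular viewer's API):

```python
import math

TILE = 256  # tile edge in pixels; a common choice, nothing Photoshop-specific

def pyramid_level(zoom: float) -> int:
    # Levels halve the image each step (powers of two). When zoomed out
    # (zoom < 1) we can read from a coarser pre-scaled level.
    return max(0, math.floor(math.log2(1 / zoom))) if zoom < 1 else 0

def visible_tiles(viewport, zoom):
    # (col, row) tiles of the chosen level that the viewport touches.
    # viewport = (x, y, w, h) in full-resolution document pixels.
    scale = 1 / 2 ** pyramid_level(zoom)
    x, y, w, h = viewport
    c0, r0 = int(x * scale) // TILE, int(y * scale) // TILE
    c1, r1 = int((x + w) * scale) // TILE, int((y + h) * scale) // TILE
    return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]

# A 4096x4096 view at 25% zoom reads level 2 and touches a 5x5 block of tiles
print(pyramid_level(0.25), len(visible_tiles((0, 0, 4096, 4096), 0.25)))  # → 2 25
```

However big the underlying document, the work per frame is bounded by how many tiles fit on screen, which is why smooth pan/zoom by itself says little about a device.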


> It's not impressive if it's not doing it with ideal sharpness in real-time, which it wasn't.

Do you have evidence backing up this claim?

> Go plop the largest image you can find onto your nearest Android phone and it'll zoom & pan that image at 60fps with nary a stutter or issue

I've never seen an Android device composite, pan, and zoom a multi-layer gigapixel image with the same performance I saw today. If you have a video you could share, I'd be interested to see it.

> It will take a while for it to re-snap the sharpness as you zoom in

To let you in a bit, Photoshop was doing the recomposition of the destination region in real-time. And it's not just a single layer that's on display there, but hundreds, composited together while the user zooms in and out on varying regions of the document.


I once tried using different devices to render a down-scaled version of the full stitch of XKCD's Click and Drag. No device I've tried lets me zoom and pan at 60fps. On my Mac, there is noticeable latency when quickly zooming or panning, and it's definitely not 60fps. I have no reason to believe any old Android phone will do it better.

Now I don't doubt that an image viewer specially optimized for zooming and panning huge images like this can do it at 60fps. The typical image viewer likely does not.


You think a many-layered PSD file is comparable to zooming a single image?

That's like wondering why the source + instrumented dev environment uses so much more resources than its compiled, optimized binary.

What a weird confusion to see on HN.


Of course, because it is. It's going to do a tiled render, likely at powers of 2, and all the heavy many-layer PSD stuff is completely asynchronous from the actual panning & zooming.

Or are you so naive you think that anything of significance is happening on or blocking the UI thread such that merely showing "smoothness" in any way reflects anything about the device's performance at actually doing the photoshop work?


I know you're trying to make the point that most software is bloated/slow/etc. but this is not the example to use for that point. Things don't just get faster because it's current year. The size of photos has increased by orders of magnitude since Photoshop debuted, the size of screens has increased, and the capabilities of the program have increased. In Photoshop, it is not only plausible but commonplace to have multiple 50MB files visible simultaneously, some raster, some vector, some text or shape. It's also worth considering that Photoshop only really uses 2 cores at maximum anyway.


> It's also worth considering that Photoshop only really uses 2 cores at maximum anyway.

Seriously? I think this rather confirms that the person you replied to is right. Unless there is some reason I can't think of that makes using only 2 cores a good thing?


What would you do with more CPU cores? One for the UI; one for CPU-only filters (if there is such a thing) and various other background workers like network downloads.

Everything else—the rendering, scaling, compositing, etc.—runs on the GPU, not the CPU, where it uses hundreds of cores. Y’know, like a 3D game.

And, like a 3D game, it’s still slow, because it’s doing something genuinely hard and your powerful GPU still isn’t powerful enough.


Answering this question means you have to understand a bit about how computers work. More cores doesn't automatically translate to faster performance. Photoshop is lightly threaded for most operations.

A cursory search brought up this link, but you should read up on threads and the difference between CPUs and GPUs:

https://macperformanceguide.com/blog/2017/20171223_2311-why-...


These are things that are processed better by a smaller number of more powerful cores than a larger quantity of cores.

Multiple cores are really good at handling a lot of smaller processes. Photoshop just has a couple really heavy individual processes.
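
Amdahl's law is the usual way to make this concrete: if a meaningful slice of the work is serial, extra cores stop paying off very quickly. The 50% parallel fraction below is an illustrative number, not a measured Photoshop figure:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    # Overall speedup when only part of a task can be spread across cores.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

print(round(amdahl_speedup(0.5, 2), 2))   # → 1.33 with 2 cores
print(round(amdahl_speedup(0.5, 64), 2))  # → 1.97 with 64 cores: barely better
```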


Not “orders of magnitude”, no.

Demo was 12,000 x 12,000 pixels, 2 orders of magnitude would be pictures of 120 x 120 pixels.

We did more with Photoshop in the past than 128px icons.


12000 x 12000 = 1.44 * 10^8 pixels.

120 x 120 = 1.44 * 10^4 pixels.

4 orders of magnitude. And in addition to that, the RAW sensor information from the camera has given each pixel more 'depth'. Oftentimes, each color is 12-bit (36 bits per pixel). There's more to each pixel than what's displayed on the screen: it's not just a JPG you're viewing.
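
A quick check of that ratio, counted per pixel rather than per edge:

```python
import math

big = 12_000 * 12_000   # pixels in the demo document
small = 120 * 120       # pixels in the 120 x 120 example above
print(math.log10(big / small))  # → 4.0: four orders of magnitude
```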


Why does Photoshop use only 2 cores at maximum?


Then write a competitor to Photoshop that performs better, since doing so is not a big deal.


I think the app was applying filters to a large image on the fly as she was doing that, so it’s not unimpressive.


There are plenty of times when I say "this software is bloated, inefficient, and should be snappy".

Scrolling through a 3 GB 157-layer 12000x12000 image is not one of those times.


It was a 3gig PSD file. That is pretty impressive.


No! It's not! Not in 2018! You only think it's impressive because you're used to UIs being total fucking shit! That's why it's an indictment of the software industry!

I'm not saying it's not hard: it is hard. Programming is hard. Making sure you're not occupying the UI thread for more than 16 milliseconds per frame is sometimes a challenge. You have to use all sorts of scary stuff like "concurrency" and "threads" and "GPUs". But it's 2018: non-laggy UIs should be the baseline everywhere, for everything.
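
The pattern being demanded here is old and simple: never do slow work on the UI thread; draw a cached low-res placeholder now and swap in the sharp pixels when a worker finishes. A toy sketch, with Python threads standing in for GCD queues and all names hypothetical:

```python
import queue
import threading

work = queue.Queue()
decoded = {}  # region -> sharp pixels, filled in by the background worker

def decode_worker():
    # Background thread: does the slow work, never blocks a frame.
    while (region := work.get()) is not None:
        decoded[region] = f"sharp pixels for {region}"  # stand-in for a real decode

def draw_frame(region):
    # "UI thread": must finish within the ~16 ms frame budget, so it only
    # enqueues the request and draws whatever is already available.
    work.put(region)
    return decoded.get(region, "low-res placeholder")

worker = threading.Thread(target=decode_worker, daemon=True)
worker.start()
draw_frame("tile-3-7")   # usually still the placeholder on the first frame
work.put(None)           # sentinel: shut the worker down for this demo
worker.join()
draw_frame("tile-3-7")   # a later frame: the sharp version has popped in
```

The frame loop stays responsive regardless of how slow the decode is; only the "pop-in" latency reflects the device's actual horsepower.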


Again, the demo was performing some very impressive operations on the images in real time, not just making sure the UI was responsive.


Sounds like you just don't understand the challenges of what was demonstrated, rather than this being a fair lamentation of modern software. Like reading kids on /r/gaming complain that a game doesn't have unlimited draw distance while having no clue why the limitation exists.


What exactly should people be impressed by, if not hard things done well?


I'm glad somebody else feels this way; when my Amiga 1200 runs more smoothly than most modern computers, you know something has gone horribly wrong.


Yes, my washing machine also runs completely smoothly. One turn of the dial and it has selected the correct program without any lag!

Apples and Oranges.


I just had my whole Mac lock up after beginning the extraction of an .xip archive. It is absolutely unacceptable for the UI to stop responding, no matter how much load my computer is under.

My Amiga never did this because the whole user interface was interrupt driven, it wasn't possible for any of the bloated trash running to deem itself more important.


Yes, I edit 12000x12000 Photoshop images in real-time on my TRS-80 Model 4 every day. Kids these days.


How big are the images you view on it?


I feel you are not wrong. But I have no hope that quality and performance will be demanded any time soon, considering how vehemently JavaScript desktop software is defended here with terms like 'time to market', 'users have spoken', etc.


Was this during the presentation where they were dealing with a huge photoshop file?


Yes. I'm often critical of the presenters Apple chooses, but I thought that comment made sense in context, especially coming from someone who was likely more an artist than a technologist.


Maybe I'm being harsh, but I really don't think that it should be impressive that, on a device of this power, you can zoom in and out of a photoshop file without the UI lagging. Note, btw, that the pixels aren't updating in real time, it takes a little bit for the higher resolution to pop in when she zooms in (unless they're recently in cache). She's literally talking about not having the UI lag. This shouldn't be an applause line, this should be basic functionality in 2018.


It's a 12GB file with literally thousands of image filters being applied in realtime on a tablet. It's not about the UI it's about the image selection and realtime filters on a GIANT composite image. That's one hell of a lot of processing happening.


Yeah, if you’ve never pushed Photoshop and think it’s just a photo editing software then it’s understandable that you’d feel this way.


This exactly. People don't seem to grasp just how powerful Photoshop is, and just what kind of power you need to utilize it.


I’d assume so. I don’t recall any other mention of scrolling performance.


UI speed has always been a feature. At one point in Mac OS X, Finder window resizing was very laggy.

I think Steve even did a keynote demo of fluid Finder window resizing during the 10.2 release.

reference: https://macosx.com/threads/why-is-window-resizing-so-slow-st...


Shouldn’t you be more specific and say an indictment of Photoshop or Adobe?

Perhaps they have a lot they can optimize, but the layering, masking, opacity, blending, and whatever else is complex.

There’s a lot to be said about reduced efficiency from the layers of languages and abstractions we use today, but I’m not sure Photoshop is the best example to use to torch the whole industry.


It surprises me that none of the top comments on a site called Hacker News are about the fact that this is far from anything that could be used for software development. The only feasible way to use this for dev would be a cloud IDE, since I have no access to the OS or ability to install dependencies.

I love the idea of the iPad becoming something that could replace a laptop, but for me I wait for the day when I can at the very least have sandbox access to the filesystem, and OS kernel to be able to set it up for local development.

Why Apple continues to refuse support for BSD jails or the like confuses me.


Not all of us are developers. I'm a 3D artist who writes some scripts.

"A computer hacker is any skilled computer expert that uses their technical knowledge to overcome a problem."

About the BSD jails, it's probably to keep it simple for most of their user base? Power users aren't really their thing anymore. Although I bought a last generation iPad Pro with Pencil for drawing / some 2d animation. I like it better than Wacom now.


While my laptop was away getting fixed, I used an iPad Pro, Textastic, and a Linux instance on DigitalOcean to do a reasonable amount of Rust development. I was surprised how well it worked.


For nomads it is actually a good way to work, too, because it is more secure for your clients in a way.


I agree that Apple needs to improve developer experience on the iPad. However, regarding the file system access: Apple already has a Files app that handles this very well (introduced with iOS 11).


I agree.

Personally I think Chrome OS will be so much better for developers once Google gets all the wrinkles ironed out of its Linux support.


I see where you're coming from, but there is a lot of software that can be written without sandbox access to the OS kernel or filesystem.

In particular, wanting to run absolutely everything locally is reasonable, but not a requirement, I think.

Currently the only thing that's really lacking is direct git integration in Coda or the other editors, and I know Panic is working on it. Git apps can be used to bridge the gap, but it's just too cumbersome.

Otherwise, accessing an external DB , or even having my code run on a VM somewhere else is not an issue as long as I can remotely debug.


This is a repost of: https://news.ycombinator.com/item?id=18338048

I was hoping we would see Xcode for the iPad. The Swift Playgrounds app really does not cut it; it hasn't gotten any love[1] since it was first introduced. You can actually import UIKit and the other frameworks and compile them on the iPad, but this is pretty much undocumented. The preview views are tremendously buggy (really, the whole app is) and there is all sorts of hidden debug stuff running in the background[2] that slows down performance. Everything has to be in a single file, so doing anything moderately complicated is a pain; the editor is buggy and will sometimes highlight just the last three letters of a word; and there is no Interface Builder[3]. I would love a real Xcode.

[1] There have been new playgrounds added (e.g. “Learn to Code with Swift”) but the app itself really hasn’t been changed at all. The same bugs have been present for years, and they aren’t obscure either. You will run into editor bugs within the first 5 minutes.

[2] The debug stuff is to help facilitate the different playgrounds, but doesn’t really provide any benefit to someone coding in a blank file outside of a playground. You can’t turn it off within the app.

[3] I know a lot of people don’t like IB, but it’s great for throwing together simple apps quickly. It’s odd that it is absent from the official Apple app when Pythonista, an app made by one person, has a visual interface editor.


Xcode on iPad seems like a WWDC-type announcement. And if there is some sort of new declarative UI that will target iOS and macOS, they may just wait a year or two until that is ready (not that Xcode as-is would be ready today even if Apple wanted it). If they can get Photoshop on the iPad, I’m optimistic we’ll get some kind of iOS IDE in the next 9-18 months.


I can't imagine Xcode could run in the RAM available on an iPad — maybe rework it to do the heavy lifting in the cloud?


iPad Pro has 4 GB of RAM. There are Macs with that much RAM that can run Xcode.


The 1TB configs have 6 GB of RAM, making it even less of a leap to assume they can do it soon.


But iOS doesn't have a concept of swap the way a traditional PC OS like macOS or Windows does, does it?


No, but iOS can kill apps in the background during times of high memory pressure.


So it can run Xcode for about 4 seconds until the OS decides to kill it?


> in the background


So when you debug your app?


Killing Xcode would also end your debug session. Presumably Apple will think that far ahead and grant the necessary entitlements so that this doesn't happen.


Correct, it kills apps before they swap.


iOS still runs the Darwin kernel. I'd be shocked if it didn't use swap.


It doesn't, I believe mostly to keep apps from using large amounts of memory in general, and to keep write cycles to a minimum.


I've always figured it had most to do with keeping responsiveness high and predictable.


It would be very unlike Apple to lean on cloud tech for this kind of thing. iPads won’t run Xcode until they have sufficient specs to do so on their own.


This page makes it sound like the Apple Pencil and the foldable keyboard come in the box, when in fact they are sold separately and are very expensive.

So on top of your £769 for the entry-level 'Pro, the Pencil is an extra £119 and the keyboard is £179, bringing the whole package to £1047. Even more if you go for the 12.9" model.


You forget that the iPad Pro also has a full digitizer on it.

https://www.bhphotovideo.com/c/product/1408960-REG/wacom_dth...

What are your thoughts on this Wacom Cintiq Pro? Do you think this is too expensive?

A lot of designers I know are replacing that with an iPad Pro. For many, the iPad is a much more flexible tool AND cheaper AND more powerful. The Cintiq still has its place, and for people who very specifically need it, it's useful. But many are finding that an iPad Pro is a better tool for them.

So that's the pricing perspective you need to understand the iPad Pro.


I was commenting more on the fact that the pencil and keyboard are sold separately, even though they're pictured extensively on the information page about the iPad Pro

It gave me the impression they were included in the box.


People in the market for a device of this price are not usually price sensitive, especially to peripherals, so this isn't much of a surprise.

Also, I use an iPad Pro without a keyboard or pencil, so I saved a good 300 clams, and I am really happy with the rest of what I got.


Very impressive demos, but unfortunately the number of apps that can make full use of the increased power is still limited.

A common complaint I saw during the presentation was the emphasis of the A12X for gaming, but Apple is still very bad at incentivizing developers to make games with premium experiences and prices (to say nothing of the lack of an official gaming controller!)

Also, the new Apple Pencil will only work with the new iPads; it's not compatible with previous Pros: https://www.apple.com/apple-pencil/


> Also, the new Apple Pencil will only work with the new iPads; it's not compatible with previous Pros: https://www.apple.com/apple-pencil/

Because the older pros don't have a way to charge it.

Which is to me another one of those ingenious-but-now-obvious design choices. I know Samsung's Galaxy Note series has "wireless" charging for its stylus, but the pen needs to go into the device to be charged; this is so much cooler because it doesn't require that. I wish Microsoft had thought of this for the Surface pen.


> Which is to me another one of those ingenious but now obvious design choices ... I wish that Microsoft had thought of this for the surface pen.

I'm kind of glad they didn't. A pen magnetically attached will likely pop off when you put the tablet in your bag. It's a nice place to dock the pen when you're using the tablet, but a poor place to store it in a bag.

People report up to a year of life with the Surface Pen battery. Swapping it out when it dies seems simpler than remembering to charge it regularly.


Yup. I've used tablet PCs since the Windows XP Tablet Edition days and magnetic pen docks have always been a big no-no. The pens fall off more when handling because you bump into them when repositioning and they're easier to forget or 'lose' because the docking area generally becomes the one place you don't put it. In short, it might be counter-intuitive, but you need to be slightly more engaged with the stylus's maintenance than most people think. Too cumbersome and it impedes productivity, but too 'intuitive' and it will be lost in a week. But, I'm sure Apple will be happy to sell you a replacement!

I've had a Surface Pro 3 since release and have had to replace the stylus battery twice, so I'm getting ~1.5 years per battery. I would much prefer the old hidey-hole stylus dock to Apple or MS's current stylus docking.


"Hidey hole" means you have a bunch of wasted space in your device and encourages them to make the pen tiny, though. I much prefer the independent (and full sized) stylus. I rarely use the magnetic docking, though.

I fully agree about the pen falling off when bumped. I can't imagine Apple solved that, since it seems like a fundamental side effect of the design.


> A pen magnetically attached will likely pop off when you put the tablet in your bag.

The surface pen already magnetically attaches to the surface, this would just give that attachment another purpose.

> Swapping it out when it dies seems simpler than remembering to charge it regularly.

You don't have to remember anything, you just "redock" the stylus when you're done using it and it will charge.


> The surface pen already magnetically attaches to the surface, this would just give that another purpose.

I'm aware. But it's not "just another purpose". It's a different design. A battery that needs charging is a fundamentally different design choice than a battery that needs replacing. I'd rather a disposable battery that lasts for a year than a rechargeable battery that lasts a week.

> You don't have to remember anything, you just "redock" the stylus when you're done using it and it will charge.

And then you shove your tablet in your bag and the pen falls off. The pen languishes in the bottom of the bag until the next time you want to use it, when it may or may not still be holding a usable charge.

A pen that needs to be stuck to the side of your device regularly is fine for a device that sits on a desk all the time. It's less clearly beneficial for a device that gets handled constantly and goes into and out of bags.


>I'd rather a disposable battery that lasts for a year than a rechargeable battery that lasts a week

The science isn't quite there yet with those uranium batteries


Unless the Note series has changed dramatically, which is always possible -- I just ditched my Note 3 -- they're using passive Wacom pens. No charging necessary; the internal slot is purely for storage.


This was definitely still true through the Note 8; but the Note 9 S-Pen has an onboard battery (mostly, I think, for the remote functionality; the core stylus functionality might use the same tech as before.)


The Note9 added bluetooth to the S Pen (you can use it as a remote camera shutter now), and it charges inductively when it's inside the phone.


I figured that'd be the case, but still disappointed.


Yes, my wildest dream is that they will release Xcode with developer tools (i.e. compilers); my laptop will be gone that exact second.


Why are people so eager to leap from computer OSes (macOS or even Windows) to these walled gardens like iOS? I can't imagine developing software on iOS, because you know the deployment procedure will be straight to the App Store. This ends the ability to create, distribute, and install software yourself. Is this really what we want???


Because the Bazaar is grating.

Out here in the free world, every time you want to use a library, you have to choose between 12 different options, all of which suck in one way or another, and you really have no way of making an informed choice. So you pick something by gut feel. Half the time you are wrong, and all of the time you worry that you have made the wrong choice.

But if you buy into Apple's cathedral, they tell you what to use, and you use it. End of story. It's always behind the times, and it won't run outside of Apple's stuff, but it works (for now, at least), it's well documented, and you don't have to spend days finding and comparing solutions. You can get back to writing code, which is the fun part.

Is it good business? Good for the world? Maybe not. But it's certainly attractive.


Yes, it's what most people want. As a developer, I do not miss the days of trying to drive people to my website to find my app. If you were lucky you could get a mention on a well-known website or one of the many download aggregators, but it was hit and miss. The Mac had some very good ones, but with Windows, not one was big enough to be worth your time. It was a nightmare. Getting noticed on the app store is non-trivial, but at least your product is already in the place where users are going to find stuff. It's one less thing to have to worry about.

As a user, I don't miss having to search the general web to find something for my computer, only to be misled by poor results in the search engine, malware download sites, software that would be the perfect solution but doesn't run on my OS, ads everywhere, etc. Now I have one place to go and everything on it works with my system. Sure it's not perfect, and discovery could be improved, but there's no malware that I've ever run into, and everything I actually need is there.

Also, it's a false dichotomy between PC OSes and iOS. Both macOS and Windows have app stores that their users use.


> at least your product is already in the place where users are going to find stuff

Along with a million other apps.


I would love it if there were an app like Termux for iOS (if there is, and I’ve missed it, please let me know!). On Android, Termux lets you use the local command line in some sort of chroot where you can install packages with apt. If this were made available to iPad users and it integrated well with Xcode (one of the issues with sandboxing), it wouldn’t be very far off from the macOS experience.


Yes, because these walled gardens offer us very nice tooling similar to the Xerox PARC days of yore, instead of Lego pieces that we need to sort out anew every year, while offering sandboxes that actually protect $HOME from random apps.


Because their experience (and probably livelihood) rests with iOS apps. No one is buying an iPad hoping that some day they can write kernel patches on it. If you make your money off an ecosystem, you are going to buy hardware for that ecosystem.


Ditto: if I could use Xcode on the iPad Pro I'd almost never use my MacBook Pro. That may be why it hasn't happened yet, as many of us probably feel that way.


>That may be why it hasn't happed yet as many of us probably feel that way.

I don't know about that; one of Apple's hallmarks is that they aren't afraid to compete with their own products. Some companies would have said "We can't make the tablet into a powerful computer, it would cut into notebook sales."

Especially with the USB-C change, these iPads are a serious competitor to Apple's laptops for pretty much anyone except developers. In a lot of ways they're better. Just not for programming because the software is still too sandboxed.

Here's hoping that Xcode shows up.


Same here. Especially with the new hardware from this year. Apple puts a ton of engineering into these new iOS devices, a little sad that the Mac lines don't really get the same treatment.

I feel we are insanely close to this reality though. We have large enough displays now (even on the small one), external display support, external keyboard support, and USB-C, so we can actually plug an iPhone into the iPad itself. (Though this could have been skirted with the advent of wireless debugging support in Xcode; I'd still rather have the wire.) The only thing I really want is some kind of trackpad or trackpad replacement. Or even just mouse support, and I'll drop one in my bag.


Xcode would need a shitload of polishing to approach the stability and UX standards of an iOS application. I desperately wish they would have a few release cycles of pure stabilisation work.


That was the whole point of Xcode 10…


10 has been much better for me.


I like my mouse too much for this to be possible.

Selecting/moving text with a finger is more annoying.


I was expecting that. Then I remembered the iPad doesn't support a mouse.


I will be good with a wireless trackpad.


>> (to say nothing of the lack of an official gaming controller!)

Apple themselves don't publish an 'official' gaming controller, no; instead they offer a number of third-party controllers of quite nice build quality, at prices comparable to buying a new PS4/Xbox controller:

https://www.apple.com/shop/accessories/all-accessories/toys-...


It's about developers having an install base to make games against. They need to ship a controller with the Apple TV; otherwise gaming stays in the casual corner it usually occupies on iOS.

These iPads are more powerful than any Nintendo Switch, but Nintendo has them beat because they have a dedicated gaming hardware ecosystem. I wish I could use my iPhone like I would a Switch, but I can't, because there aren't any serious games or first-party hardware.

As a result, serious game devs target mobile last, if they do at all. I bought Transistor for my Apple TV, for example, and asked for my first refund of an iOS app ever because it was a crashy mess. I played it on my PC instead.


Making a controller an optional peripheral means that support is an afterthought for most app makers and that games are mostly designed with touch in mind. Nintendo and Sega have shown time and again throughout their histories that optional controllers get little to no 3rd-party support.


That's true; I do have a Steelseries Nimbus, although it feels a bit jank compared to console controllers from other companies.


Doubt they would do it, but their own spin on the Nintendo seal of quality, with a floor of $50-$60 USD (and their own spin on "Selects" a few years later dropping some of those prices to $20USD) with some mandates on what that seal of quality means (performance, software features, hardware support, &c.) and they might turn that gaming ship around.

Among their requirements they could list: Apple TV with Controller support (allowing devs to ditch the useless support for the Remote), some online features (some rebranded version of Game Center), Mac support (for Macs that support the latest iterations of Metal).

It's the little things.


Interestingly, one place where this race to the bottom is NOT present is in iPad music apps, synths and DAWs. Prices range from 10 to 50, and the apps are polished.


You are right about the prices. Expecting a $60 AAA console game for $5 is not going to happen. Free to play AAA games where you end up spending $1000 to do everything means players end up hating the game and paying for fewer games. The alternative is games with truly massive player bases like Minecraft or Fortnite, which isn't scalable either.


Nintendo seems to be doing quite well at this. Why would we not have the same expectation for Apple?


Because Nintendo only started being able to do this within the last year, and even then only a few big AAA studios are really on board. AAA games on the Switch are really limited to in-house Nintendo IP and Bethesda. There are a few others, but not a ton. Notable exceptions include Epic with Fortnite, but that's also on iOS so it doesn't really prove the point.

So really Nintendo isn't quite doing it yet. Said as an absolute lover of the switch.


Fair point, but their SDK is allowing indie developers to fast track access to the ecosystem and hardware.

If Apple had a snap on controller like the one we're seeing Microsoft leaking, it would be game over. A great controller with great support on a refreshed Apple iPad mini would clobber everything on the market and it wouldn't even be close.


> unfortunately the number of apps that can make full use of the increased power is still limited

Isn't that a good thing? For example, wouldn't it be great if all web apps were optimized for specs lower than the machines we use?


Counterpoint: A full version of Photoshop is coming to iPad in 2019. Adobe announced it at their conference recently. I suspect they'll bring full versions of Lightroom, and all their other Pro Apps soon to iPad.


Yep; here are live apps:

* Lightroom CC on iPhone/iPad: https://helpx.adobe.com/lightroom-cc/how-to/lightroom-mobile...

* Just recently announced Premier Rush: https://www.adobe.com/products/premiere-rush.html

Planned/announced:

* Photoshop: https://www.adobe.com/creativecloud/photoshop-ecosystem.html

* Project Gemini (for drawing/painting/illustrating): https://theblog.adobe.com/introducing-project-gemini/


It always takes time to fully utilize a new platform's specs.


I remember 2005, when I was running Windows with Linux in a VM, Photoshop, Firefox, my Rails application, and all of the other software I needed for development, music, and everything else on a 1GHz AMD Athlon with 256MB of SDRAM.

When it comes to software that's not games or stuff like video editing, it seems to me like most of what happens is "it takes time for new software/updates to show up that once again are able to waste the full extent of my newly acquired processing power."

I know we get more robust games now, I know we render 4K videos instead of 480p, I know we display UI on higher-resolution screens, but there's really no excuse for software with a similar feature scope to consume the same percentage of resources on modern top-of-the-line hardware as it did on a mid-tier, 18-year-old machine.

Bummer ;(


Does the old Pencil work with the new iPads though?


Per the site, apparently not (guess the "can't charge" rationale works both ways)


I'm curious whether you can actually import RAW photos into this iPad. According to [1], importing huge (42 Megapixel) RAW files into the last gen iPad Pro was pretty much unusable. I can definitely see myself making this my travel laptop, replacing my 2013 Macbook Air if I could connect my camera, import all of my photos for the day, maybe do some light editing and then export them to an external hard drive.

[1] https://paulstamatiou.com/made-on-an-ipad-pro/#photo-editing


Yes, on iOS 12 I regularly import very large RAW photos on my first-generation iPad Pro. The RAW importing workflow is finally as good as the JPG one, meaning it actually processes RAW in iOS Photos. I'll also add that in some ways, such as editing levels, the iPad and my phone seem faster than a MacBook Pro. Whether this boils down to specialized hardware, more efficient software, or something else I do not know.

I'll also add that the older iPad Pro hardware often goes on fire sale, and may do so more often now.


Unfortunately, all the Windows and Mac versions of Adobe applications are extremely slow and bloated. Using Draw on the first-gen iPad Pro to draw a portrait or something works extremely well, and then saving that and importing it into Illustrator on the Mac is a tragedy, for example.


But you still have to import it into Photos first, then jump to the app that you actually want to work in? I wish Apple would open up some of these APIs so that apps like Photoshop could actually interact with the filesystem.


There's a workaround (albeit somewhat annoying) to this: After you've imported your photos to the Photos app, select them all and then export them to your destination of choice on the Files app. Then you can open them with whatever app you need to (or import them straight to the app folder itself if possible).


Out of curiosity, what difference does that make? If you can still work with and access those photos from within the app, what difference does it make if they get imported into Photos first vs. a file system?


I think you're going to have to wait to get an answer on this. The hope is that you can use a standard USB-C card reader to quickly import directly into a Files directory.

I import raw pictures all the time, but it's usually just a few pics. I don't think people realize how long it takes to import a day's worth of pics, or how you lose all your file names when you import the pictures.


A lot of stuff is moving to HEIC as an intermediary high quality format between JPEG and RAW. Though that being said between this and Photoshop on the iPad I'd assume that they're looking to support large RAW files for just this use case. Creative graphical applications seems to be a sweet spot for this kind of device.


I'm not sure I understand why someone would get frustrated about it taking over an hour to transfer 250GB over USB 2.0. That's... math?
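For what it's worth, the math checks out. A rough estimate, assuming a sustained real-world USB 2.0 rate of about 40 MB/s (an assumption; the 480 Mbit/s on the spec sheet is the raw bus rate, not achievable bulk throughput):

```python
# Back-of-the-envelope transfer time for a 250 GB import over USB 2.0.
# 40 MB/s is an optimistic assumed sustained rate, not a spec figure.
size_gb = 250
effective_mb_per_s = 40

seconds = size_gb * 1000 / effective_mb_per_s
hours = seconds / 3600
print(f"~{hours:.1f} hours")  # ~1.7 hours
```

So "over an hour" is the best case; slower card readers or smaller files (more per-transfer overhead) only make it worse.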


I wish they made an iPhone SE as a miniature form of this iPad.

Square-ish edges (easier to hold), a better and larger LCD display within the same body size, and powerful internals.

The new iPhones are just too big for one-handed use. If you want to type the name of a website into the top bar of mobile Safari, you either bring the next hand out or drop your phone....


I wish they'd keep the iPhone SE design as it is and just update the internals.


I'm glad I'm not the only one who really loves the SE design. I bought my first SE a few years ago, and now bought the very same model a few months ago (first one broke due to water damage).

It costs a lot less than the current models, supports all the functionality I currently need, and more importantly I can use it comfortably with one hand only.


> bring the next hand out or drop your phone....

This is not a great solution, but in "Accessibility Options" you can enable the gesture to swipe down on the bottom to bring the top of the screen down to the middle so you can reach it one handed. It's clunky but helpful.

It's the functionality that used to be on a double tap of the home button or something, and I thought it was removed, but it was just turned off by default.


The double tap has brought up the app switcher since iOS 7. Maybe you're thinking of something else?


That's double press - double tap on my 6s still brings the screen down. Not sure if they kept that distinction when they moved to the not-really-a-button home button whenever that was though.


Yeah, I got really used to it! On the new devices without a home button it’s a little tricky to find that functionality since it’s disabled by default.


Agreed. I'm hoping that as some of this newer tech is commoditized, economies of scale allow an updated smaller entry to return. I wouldn't mind paying X prices for such a beast, but I'm aware the market wouldn't support $500+.


Yeah, I'm actually not that picky about the tech inside. Give me last-gen internals: an A11 processor, the previous-gen camera, no True Tone, a glued-up display, whatever it takes to reduce cost. But give me a phone that is the size of the iPhone SE with a bigger display and a decent enough camera, and I'd line up in the store for that.


With you; that's why I switched to the new small Pixel (non-XL). I use my phone as a phone.

I can see that if the phone is the only computing device you have, which is true for lots of people, you might want to deal with the larger screen because you are watching videos on it and whatnot.


The iPhone SE makes a better synthesizer than the Pixel, though:-)

And it’s cheaper


This iPad just looks like a plus-sized version of the SE. Except with a camera bump for some godforsaken reason (who is taking pictures with their iPad Pro?).


I just wish Apple would put the internals of the iPad Pro into the Apple TV and bundle it with a decent controller. The gaming potential is massive, and they even compared it to the Xbox One during the keynote. They are squandering a massive market opportunity - I just can't work out why.


Here's the thing: it's not the hardware. The iPad can be hugely powerful, and it won't get good games until those games start generating revenue. As long as $4.99 is the standard price for a game, you can't expect studios to port games at a potential loss. Especially since the iPad, with its $799+ price tag, isn't really an affordable device in comparison to game consoles.

Compare this with the Nintendo Switch: it's significantly less powerful than the last few iPad generations. But it's also more affordable, and Nintendo deliberately keeps the full-price game culture alive. Because of that, they have an actual game library to offer to customers.


For a long time Apple's problem was a lack of software on their platform. I wonder if they have gone too far now and by encouraging so many developers they have ended up lowering the standards.

To me the huge amount of low cost junk in the App Store doesn't feel right next to the high quality expensive products Apple sells.


> To me the huge amount of low cost junk in the App Store doesn't feel right next to the high quality expensive products Apple sells.

It might have something to do with all the entitled customers (children?) that will review anything that costs five dollars or more as "a ripoff", unless maybe the game is literally Minecraft.


That's terribly unfair.

When the mobile ecosystem settles on ~$5 as the price for full-featured apps, with $0.99 as the price for one-off toys and smaller tools, someone bucking the trend and charging multiple times that for apps that are not meaningfully higher quality than the rest of the charging-five-bucks ecosystem will rightly be judged by the average consumer as overpriced.

That's not "entitlement", that's violation of perfectly reasonable expectations. Don't blame or denigrate the customer for aligning their expectations to what the rest of the marketplace does.


Then make something that's worth 4-10x the standard app.

For me, Scrivener was worth the $20. Omnifocus was worth the $40. They added enough value and ability above what other apps in their category had (or paired well, in my case, with apps that I already had on my computer).

It is precisely entitlement that causes people to think that software isn't worth anything ("Code is just typing, after all.") or is only worth $5 at most. Good software is worth its cost if it adds value to your device and workflow or entertains you for a sufficient amount of time.


The Switch does have $5 games, though.


It does, but those are essentially crapware. All the big titles are in the ~$60 range, and people are used to paying that much for them.


Apple did that when Apple TV 4K came out: it has an A10X chip in it.


It lacks a controller, though. And the "best" controller is missing three inputs you'd expect (L3/R3 "stick click" buttons and a select button).


What's this missing besides the 'select' button: https://steelseries.com/gaming-controllers/nimbus


To be frank? An Apple logo.

Third-party stuff isn't sufficiently sticky. If Apple wants games on their platform, they need their own controller. It might even be best if it comes in the box.

I don't, however, think they really care that much.


It absolutely needs to come in the box for it to be effective. You need to be able to rely on 100% of your customer base having the hardware required to play it.

The thought that they don't care that much is really strange, because every single keynote that I can remember has included game demos. They always use game demos, from AAA game developers like Epic, to come out and show off the latest capabilities. They could take a sizeable chunk of PS4 and Xbox One customers very easily - fund the development of some genuinely AAA, exclusive games, bundle the controller in the box and the thing would sell like hotcakes. Nintendo is somewhat safe because their IP is much more valuable than Xbox or Sony, who basically offer the same games with a few notable exceptions.


I don't buy that the logo alone would do it. It would have to come bundled with a controller by default and all games would have to work with said controller.


Games compatibility is key. I had an Android phone I used years back with a PS3 controller and an external display to play Modern Combat 3 and it worked great as a game console... except every now and again the game forced you to swipe the screen to perform some quicktime action. Really unexpected when you're sitting 5 feet from the screen.


The controllers like the Steelseries Nimbus follow the Made for iPhone spec, which is why they're limited.


I want to say there's a new revision of the MFI controller spec, but nobody's put out hardware under it as of yet.


I don't think gamers necessarily expect an L3/R3 button. The Switch does not have anything like that, and Nintendo has always been top tier for gaming controllers.


The Switch has L3/R3 buttons (you can click the analog sticks) and games do use them (sprint in Fortnite, for example, is L3).

Even the WiiU Pro controller had clickable sticks.


I've been tinkering with some commits to get the Moonlight project working on tvOS, and the biggest stumbling block for many of the games I could play with a controller is the lack of L3/R3 buttons. Sprint, look behind, crouch, take cover, melee, etc. all get mapped to these buttons pretty frequently, since your thumb is generally resting on at least one of those sticks during normal play.


Nintendo has a history of terrible controllers, starting with the N64 controller. That didn't end until the Pro Controller for the Wii U/Switch.

Even the switch joycons, which I think are incredibly impressive with how much tech is crammed into them, are pretty awful to actually control with. The only reason I think they get a break is because they are designed for portability.


Nintendo also has a long history of missing out on many cross-platform titles.


Did they really expect gaming to take off by not including a controller?


They compared an $800 tablet to a $250 gaming console. The A12X is pretty awesome, but it's one of the dumbest comparisons Apple has ever made.


I haven't given it a great deal of thought, but I'd wager Apple doesn't want to risk developers prioritizing the controller/TV experience over touch/iPad.

Much like Apple feels that the touch interface is a different beast than trackpad/mouse, and combining them would be a mistake.


I’m enjoying plenty of gaming on my Apple TV, so unless we are talking hardcore AAA titles, there is plenty to keep yourself entertained.

It is a far less frustrating game console to use than PS/XBox. It’s actually really nice.

I bought a controller for $50 to play Minecraft. It’s not bundled (that’s a good thing).


Well, good thing you got your Minecraft in time, because they're dropping support for the Apple TV.


Wouldn't that be the Mac Mini?


Seems like the market is going in the direction of streaming (think 5-10+ years in the future). Having a powerful console is becoming less and less important.


That's a good question. Do you think Steve's opinion on gaming is still baked into the company?


Steve Jobs wasn't anti-gaming. The only moment he was anti-gaming was when he was trying to sell $2500 Macintoshes and battling IBM PC's, when IBM was applying FUD against Apple, calling their computers "toys".


Huh, looks like most of the predictions I'd heard came true: USB-C and no home button. I wonder if they're going to make the same shift on the rest of the lines? I'm a little surprised that the iPhone XS and XR weren't switched over to USB-C if this is planned for the whole line. Charging the Pencil attached to the side is pretty neat; the old solution was very... odd and seemed pretty dangerous (having such a long lever arm on the relatively small Lightning port).


USB-C is considerably bigger than Lightning while offering arguably no major benefit for an almost entirely wireless device.

Think size doesn't really matter that much? The iPhone XR actually pushes the Lightning port down slightly off center to make room for the LCD display. The XR would literally be impossible to make with USB-C.


The obvious benefit for adopting USB-C is compatibility with other devices. It seems strange to me that almost all new mobile devices from almost all manufacturers (including some from apple) are converging on a single port while apple's best-selling product uses an incompatible port. I get there are manufacturing issues, but I think eventually they'll have to be solved somehow. Maybe no charging port and just rely on wireless charging?


> The obvious benefit for adopting USB-C is compatibility with other devices.

Not an obvious benefit for a largely wireless device. For the vast vast majority of people it’s only used for charging and that’s moving to wireless too.


The obvious benefit is not needing a specific cable just for my iPhone. I have slowly been getting more USB-C cables over the last year; it would be nice if I could replace all my Lightning cables, which are only used for my iPhone and iPad. It's a small benefit, but it would be nice.


You don't need a specific cable for your iPhone. You can charge it wirelessly.

Regardless, USB-C would not be nicer than keeping the iPhone XR as thin as it is. Lightning really makes it possible.


The original Pixel is 0.2 millimeters taller, two years older, and has a USB-C port.

I was talking with an Apple hardware engineer... I'm told that the jack is in some ways a limiting factor, though I'd happily take an extra 0.2mm of depth for universal charging :-)


That's because the Pixel uses an AMOLED display. The iPhone XR uses an LCD, which is cheaper but thicker. My point stands: Lightning's space savings are essential to the iPhone XR.

Also, the iPhones do support universal charging (Qi).


I also was really hoping the new iPhones would finally make the switch to USB-C; it's ridiculous for them to continue putting Lightning ports on anything at this point, considering how they've gone all-in on USB-C on their Macbooks (and now iPad Pros).

Hopefully next year's models will finally make the switch!


This, exactly this. Completely goes against their marketing pitch that X-family devices are "future-proof". They just contradicted that entire line of thought by adding USB-C to iPads. Don't get me wrong, I'm glad they did! But pissed that my XS is still using Lightning.


> old solution was very... odd and seemed pretty dangerous

The old solution isn't as bad as the media portrayed it.

First, it comes with a socket to socket piece, so if you are charging at home, it's absolutely no problem.

Second, you are only doing that in "emergency" cases where you need it, but forget to charge at home.

No complaints here.


I don't know about "emergency"; that's pretty much always how I wind up charging my Apple Pencil. It's fast enough that it's way more convenient than finding that tiny adapter.


It's also faster than charging via the adapter, for whatever reason.


Apple Pencil is more durable than you’d think. Normal jostling while it’s charging doesn’t really affect it. With regards to USB C, I think they’re trying it out on the new iPads because these are ostensibly their “pro” iOS devices, so it makes sense to put higher-throughput ports on them first.


I'm less worried about the pencil than I am about the port/connector. The pencil is pretty long, and there's not much slop front to back in that port to deal with jostling.


The end of the pencil, near the connector, has some flex in it.


I'm almost certain that the next iPhone will use USB-C. They just didn't want to do that so soon after getting rid of the headphone jack and promoting the idea of lightning headphones.


Really, that was the time to make the switch to USB-C. Now some number of people (I have no idea how many) have gone out and bought Lightning headphones.


I think the next charging revision for the iPhone will be no charging port at all, like the Apple Watch.

But what about headphones? Also wireless.


AirPods are way more expensive than EarPods. If they included them with every iPhone, it'd add at least $100 to the price.


I think the thought is to position the iPad more like the Macs. Macs come with USB-C, too, after all.

So the iPad becomes another device to plug your phone into, so to speak (if only to charge at first).


Apple and failed proprietary connectors, name a more iconic duo.


I've never been upset enough from something I've read in a HN article to comment about it,

but the audacity to talk about intuitive design and then fucking make my vertical scroll wheel scroll the page horizontally is outrageous.


It's such a shame that there's no thunderbolt 3 in the iPad Pro. I would love to ditch my MBP to just plug an iPad into a TB3 monitor and get a mouse, keyboard, drawing tablet, and charging with 1 wire. I can get close to that with a MBP. The only downsides are a MBP is heavier for mobile use and I'd have to buy an expensive tablet.


The USB-C spec supports that, you don't need TB3 for these features. Of course I don't know whether Apple actually implemented all of that. I have a WQHD USB-C monitor (Lenovo P24h) with 45W power delivery, a USB 3 hub, and audio out at home, and it does this without TB3. So it might also be possible with this new iPad Pro.


You're right. I was working under the assumption that Apple would have just gone for TB3 if they supported those features. I'm not going to claim that I know too much about the crazy USB-C/TB3 specs, though.


I’m not clear what USB-C support will be like yet, but is it possible that will work anyway? The dongles and docks that are available for other USB-C devices (e.g. https://www.owcdigital.com/products/usb-c-dock and similar) might work.


They seemed to show the iPad Pro getting plugged into what appeared to my eye like an LG ultrafine display using USB-C and using it as a display, I'm guessing you could plug a keyboard into one of the USB-C ports in the back of the ultrafine display (or a BT). Obviously not a mouse, and I'd be surprised if a drawing tablet would work.


As someone researching drawing tablets, I'm curious why you would plug in an additional drawing tablet instead of just drawing on the iPad?


My comment was probably unclear. I would only do that alongside my MBP; the iPad itself would be the drawing tablet. I don't currently use a tablet, though.


Do iPads support mouse (trackpads) yet? That would be huge.


Nope. Apple would need to do a pretty decent amount of work on iOS to support desktop like behavior. I'm not sure if there's much sales incentive either. Which is a shame because Apple got where it is by pushing the best possible user experiences. Not blind sales maximization.


Surely getting iOS apps working on macOS (Marzipan) is laying some of this groundwork; they are having to adapt iOS UI widgets for it. What would it take to feed this back to iOS?


I see no real reason to feed it back to iOS. If Apple manages to integrate iOS apps into macOS well, I can see them shipping iPads/iPhones with i+macOS. Both OSes run the same kernel anyway, so why not just tweak them a bit, glue the environments together, and have one run a superset of apps?

Apple's superior UX team and control over app distribution makes them much more likely to succeed than when MS tried to go down a similar road. Even their long-term investment in LLVM helps here. One can only hope us consumers don't end up with a more locked down macOS as a result.


Am I going crazy? Didn't they support mice pointers when the iPad originally launched?


No, mice were never supported.


There was a jailbreak hack that added bluetooth mouse and pointer support to iOS. With another hack that added multi-window support you could turn an iPad into a buggy and slow version of macOS


Maybe you're thinking of the iOS Simulator on macOS?


Exactly. This is needed, as well as being able to run full Xcode and VS Code.


Slightly wider aspect ratio... 1.43, close to the square root of 2. Isn't that new for iPads? They were 4:3 for a while.

One of the things I think this industry gets backwards is aspect ratios. I primarily use my tablet to watch movies/tv and 99% of them are squarish. I primarily use my laptop to work, and almost all of them are now 16:9. PCs anyway.


Just looking at those product photos, I am pretty confused. How is that a display that "goes from edge to edge"? There is a bezel on each side...


It’s much smaller than the previous generation, but yes, it’s kind of a marketing conceit.


They mean glass


It's a marketing conceit. The glass has gone edge to edge since the original iPad.


Typing from the most recent iPad Pro... no, the glass stops and the edge is metal; you can catch a nail or pencil nib in the groove before the edge.


Do you all use iPad for serious work? Curious what apps you use?

Other than Apple Notes (with the Apple Pencil) and watching Netflix, I don't make good use of it. I'd love to get some recommendations for useful apps to try...


The key feature that drove me to buy the 12.9" iPad Pro last year was the ability to read and mark up PDFs without having to constantly zoom/move it around. Old eyes.

I also use it to create routes for road trips for my Jeep club: it's really nice to have my GPS app and a satellite view side by side as I look for interesting back roads.

In short: I bought a Ferrari and use it like a Civic.


Would you mind elaborating a little on your workflow / apps used here please?


There's really not much to it.

I keep computer science/math-related PDFs in Dropbox, copy them to GoodReader (I've started experimenting a bit with other PDF apps, but GoodReader has been my go to for years now), and occasionally use Apple Pencil to mark them up as I read them. I don't save them back to Dropbox for future review; GoodReader is effectively its own island in that regard.

For GPS, I use Gaia GPS to create routes, often with Apple Maps and/or Google Maps running in split screen.

If you have specific questions, I can address them, but none of it is particularly formal.


Full-time PhD student, I use it for pretty much everything besides running actual simulations. It’s the only thing I take to the office.

LiquidText for reading books and articles, Notes Plus for note taking. Both are a little more finicky than other options, but they each have unique features I find indispensable. (If you haven’t completed a handwritten problem set with lasso-and-drag and scribble-to-erase, you don’t know what you’re missing.)

I do most of my writing straight into Google Docs on the Apple keyboard cover. I also have a LaTeX app and a Jupyter console (Juno) that see some use when I’m away from home, but I still prefer to work on that stuff at a PC.

Edit: Oh, and outside of work, I use GarageBand with a Lightning stereo condenser mic to record music. I doubt if anyone with real production skills would enjoy it, but it works perfectly for my purposes.


Scrivener and Goodreader. The latter stores a lot of PDFs and such for me. The former works out pretty well for working on documents while I'm out and about (versus taking my laptop). A nice feature of Scrivener is that you can attach files as well. So it's turned out to be a great study tool (that split-screen apps kind of eliminate the need for) because I can open a PDF of a text and have it adjacent to my notes on that section or the part of the paper I'm writing that's relevant to the PDF.

My secondary use case is board games. Carcassonne and others have pretty good ports which was very handy for playing with people while traveling or instead of carting a bunch of boxes (or bags) over to friends' homes. Everyone but me has gotten married and had kids so gaming is much less frequent these days, and usually hosted by the guy who has the biggest board game collection when we do gather.

My final (present) use case is tabletop roleplaying games, but that's also covered by my current device. Hero Lab has an iPad port which was handy for some games I ran in the past, and hope to run again in the future.


I write in Pages. The Shortcuts app is a great way to automate time tracking or anything else; Toggl is good in conjunction with this.

Affinity Designer is a great vector program. LumaFusion is good for editing, though I haven't used it much. Ferrite is good for audio. Most areas have a few standout apps; it depends on what you want to do.

Explain everything is good for making educational videos.


Most of my team at a major financial firm is switching from Surface to iPad Pro as a daily driver.

The killer app is (a) ultra portability with genuinely all-day battery and LTE for meetings and the commute, with (b) every tool an office user uses.

Some of us also code (servers and infra, not desktop or mobile apps); it works great for "GitOps" workflows.


I recently wrote an article[0] about how I use my iPad. A few of my favorites are: I write in Apple Notes and Working Copy[1] (for markdown). I wireframe and do basic layout design with Keynote. Keynote is surprisingly good for this, especially because I can collaborate easily with other team members. I also use Procreate[2] for sketching. I do some server management (mainly connecting to a RaspberryPi I carry) using Blink[3].

---

[0] https://medium.com/@headquartershq/how-to-use-your-ipad-to-d... (fyi: there are some links to products we carry in this article, all app links are non-affiliate.)

[1] https://itunes.apple.com/us/app/working-copy/id896694807

[2] https://itunes.apple.com/us/app/procreate/id425073498

[3] https://itunes.apple.com/us/app/blink-shell-mosh-ssh/id11567...


I'm comfortable plugging a USB keyboard into it and SSHing into a remote machine to do dev work.

I have a MacBook Pro as well now, so I hardly ever use the iPad Pro for that sort of thing anymore but it was a pretty good experience.

It’s also an excellent music creation device. I originally bought it so I could use apps like Korg Module and Gadget and it’s great for that purpose.

It’s also my favorite device for reading kindle textbooks.


Two of my children do most of their schoolwork on iPads now. They're 8th grade & freshman in college — one we got for drawing (2017 iPad Pro) and the other to view college textbooks (2018 basic iPad), and both quickly became primary devices because they're good enough for web and GSuite — and we got keyboard cases & Apple Pencil for each.


It's great in the creative industry (like painting) where you need to look at reference photos. I leave it on my desk to look at. Anything cheaper would do I suppose, but I prefer to go all out :p


I mostly use it for displaying sheet music / band charts, and occasionally as a Presonus Studio One remote control.

I think I would like to move to using Adobe Lightroom on the iPad.


What do you use for sheet music? I use ForScore with the Pencil, and it has been an amazing pairing that a magnet-mounted Pencil will only make better. One of my bandmates uses an Android tablet and generic stylus, and it's not even comparable how much better my setup is. I'm sure there are a dozen Android tablets with active styli, but I haven't seen them.


As a band leader, I have produced score material using Sibelius for years, and have been using Avid Scorch Sibelius file viewer on the iPad. Unfortunately, Avid seems not especially interested in supporting this product... this is a case where I can really see practical value in Free Software! I would love to be able to contribute some fixes and improvements.

But it works well enough.

I am looking forward to the Steinberg Dorico team putting out a MusicXML viewer for iPad... and have also occasionally contemplated developing one myself, to make sure I get the feature set I want. :-)

[There are good PDF-based systems, but we sometimes need to switch keys for vocalists, and MusicXML works better for that, if you don't know in advance to prepare a new document.]
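For the curious, the reason MusicXML handles key switches so cleanly is that pitches are stored symbolically (step/alter/octave) rather than as drawn glyphs, so transposition is just arithmetic on the pitch elements. A toy sketch using Python's stdlib — it spells everything with sharps and ignores key signatures, which a real engraver would never do:

```python
import xml.etree.ElementTree as ET

# Map a MusicXML <step> to a semitone index within the octave.
STEP_SEMITONES = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}
# Spell transposed notes with sharps only -- a real engraver would
# respect the key signature when choosing sharps vs. flats.
SEMITONE_SPELLING = {
    0: ("C", 0), 1: ("C", 1), 2: ("D", 0), 3: ("D", 1), 4: ("E", 0),
    5: ("F", 0), 6: ("F", 1), 7: ("G", 0), 8: ("G", 1), 9: ("A", 0),
    10: ("A", 1), 11: ("B", 0),
}

def transpose(xml_text: str, semitones: int) -> str:
    """Shift every <pitch> in a MusicXML fragment by `semitones`."""
    root = ET.fromstring(xml_text)
    for pitch in root.iter("pitch"):
        step = pitch.find("step")
        alter = pitch.find("alter")
        octave = pitch.find("octave")
        # Convert to a MIDI-style absolute semitone number.
        midi = (int(octave.text) + 1) * 12 + STEP_SEMITONES[step.text]
        if alter is not None:
            midi += int(alter.text)
        midi += semitones
        new_step, new_alter = SEMITONE_SPELLING[midi % 12]
        step.text = new_step
        octave.text = str(midi // 12 - 1)
        if new_alter:
            if alter is None:
                alter = ET.Element("alter")
                pitch.insert(1, alter)  # MusicXML order: step, alter, octave
            alter.text = str(new_alter)
        elif alter is not None:
            pitch.remove(alter)
    return ET.tostring(root, encoding="unicode")

# Move a lone C4 up a whole step for a vocalist who wants the song in D.
score = "<measure><note><pitch><step>C</step><octave>4</octave></pitch></note></measure>"
print(transpose(score, 2))
```

A PDF, by contrast, only knows where the ink goes, which is why switching keys on the fly needs the symbolic format.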


All of my scores are PDFs scans of actual paper music I have (and own, copyright, etc), so ForScore being a glorified PDF viewer has been more than fine for me. It's super cool to hear about alternative uses, especially around more dynamic music notation, like MusicXML (of which I know nothing).

I've done iOS dev in the past (it's not my daytime gig at the moment), so if your contemplations ever get more serious and you want an enthusiastic partner, hit me up. I'd love to do something musical.


Contract management! Reading / searching / bookmarking sections works excellently on this, it just seems more natural than when using a normal laptop. More like reading a (significantly larger) paper copy of a contract.


I've used it at conferences and occasionally out and about when I need a machine with access to tools (browser, Slack, etc) but I know I won't be doing any development.


I really like OmniGraffle. It's got a free trial, but the trial is short enough that I'd wait to start it until you can spend a little time kicking the tires.


Finally a mobile device that uses USB-C to charge.

I'm glad they fixed the awful UX on charging the pencil. Also stoked there's no notch in the screen, but sad that there's rounded corners.

Still no ability to let people make apps for the iPad on the iPad, which I think is truly disappointing.

*edit: I just realized it doesn't have a headphone jack and does have a camera bump. Booooooo


> Finally a mobile device that uses USB-C to charge.

Not sure if you're being sarcastic, but don't all (most) mobile devices use USB-C to charge these days?


They're talking specifically about mobile devices from apple.


They do. As do Macbooks too (which is actually rather annoying when you only have two USB-C ports and need to plug other stuff in besides a charger). iPhones, however, are still using their dinky lightning cable. So I'm assuming the GP was talking specifically about iPhones rather than "mobile devices" in the more general sense.


Sorry, I meant an Apple mobile device.


I guess that then depends if you’re also not classing MacBooks as mobile devices because Apple does count their laptops when publishing their mobile device sales figures.


If by "all (most)" you mean some, then yes, exactly right.


I just don't understand why they would release all-new phones literally 2 months ago with Lightning ports, but then release iPads with USB-C. Makes no sense, especially with how they market the X-family as "future-proof". Clearly they aren't. They should have gone with USB-C all around.


It's likely that Lightning connectors will be used for phones and other devices where "slimness" is important, since the Lightning connector is thinner.

USB-C will likely be a thing on "pro" devices where users are expected to create more than consume, and because USB-C can interface with more hardware.


Then they need to start shipping iPhones with Lightning to USB-C cables and swap out the port on the wall chargers. There needs to be some bridge between the two categories of devices that they currently are not providing.

This should come in the box with every iPhone: https://www.apple.com/shop/product/MQGJ2AM/A/usb-c-to-lightn...


I would never be able to plug that in correctly in the dark on the first try.


Can you elaborate? Would it not be easier than plugging in a USB-A cord in the dark, seeing that USB-C is reversible and USB-A is not? Is it because both ends would feel more similar?


You can tell which end is which by touch alone with USB-A (since the USB-A connector is so much larger than the Lightning connector).


Touch-wise, the connectors feel different. I think you would be fine.


It should be taken as an indication that Apple views the iPad Pro as the bottom end of their laptop lineup, which is now standardized on USB C.


Very good point - that makes more sense as to why they would do this. Nonetheless frustrating IMO for anyone who uses the whole line of devices together.


Adding to the other replies, the phones are also on an S release, which means the form factor wouldn’t have changed much this year. If USB-C arrives for phones it will be in next year’s release. The phones also have a shorter shelf life than the iPads, so it’s not as big a deal if it isn’t in the phones yet.


This is new for Apple. My iPad Pro 10.5 is continuing to appreciate instead of depreciate. They keep raising the price: first from $879 to $899, and now it's $929. What is going on at Apple where older products cost more?

https://imgur.com/a/JB7fR4a


They have the same name but are you sure they are the same spec?


They are exactly the same.


Increase in costs due to tariffs on Chinese imports?


I love my iPad Pro so much, this is a really great upgrade IMO. I'm sure I'll be planning on a way to upgrade. Only downside is that I prefer touchId to FaceId but I'm mostly switching back to a passcode in any case.


> I love my iPad Pro so much,

Why? What is your use case? (Genuinely curious.)


I use it for nearly everything, it's simply the most convenient and fast computer device I own and I have far too many. Examples:

-- Art: Apple Pencil and Procreate are the best drawing platform on the market IMO; Clip Studio on iPad is really close. I'm not a professional but I have tried and owned Wacoms (including Cintiqs) and a Microsoft Surface Book. The iPad Pro's screen and drawing experience is far better IMO.

-- Music: I used to play guitar and still mess around a bit with it, also I've started programming electronic music. Garage Band, DJay Pro 2, and Bias Amp 2 are really amazing pieces of kit. The touch interface for Bias Amp 2 in particular is amazing and feels like using a real guitar amp.

-- Gaming: Card and Board games, Pandemic, Star Realms, Hearthstone, etc.... are absolutely killer on the big 12.9 inch screen. Civ 6 is also an amazing game on the iPad. Along with a ton of AR based games.

-- Reading: iBooks and Kindle

-- Of course all of the standard social media and communications apps. I use the Logitech smart connect keyboard and it's very nice.

-- Software Development: I'm an iOS dev and I use Swift playgrounds on iPad a ton to sketch ideas. When I do webdev I'll often use the iPad with Coda and a remote server and it's a great experience.


How do you like Coda? I’ve been a Transmit user forever, so I’m willing to give them the benefit of the doubt.

With support for the lg 4/5ks I could actually see myself going all in on an iPad Pro for a home setup.


Coda's tabbed approach took me a bit to really like, but the app is really top notch in quality and functionality. Transmit, Prompt (their excellent SSH client), and a really good editor are built into Coda.


You should look into Korg Gadget, it’s my favorite music creation app for the iPad.


Rad, I'll check it out. Thanks!


Not OP, but with the same feeling.

It is simply the best media consumption device I have ever owned.

I bought an iPad 4 at the time (I later upgraded it to an Air and then a Pro) and left it almost untouched for months, then I started using it for web browsing, Youtube, Netflix, Twitch, e-books, light gaming... over time it has replaced not only my PC/laptop, but also my TV, for many activities.

I do not use it for anything professional-related (maybe reviewing some document when on a plane, or an occasional ssh connection to a remote console), but the media consumption part is enough for me.


The fixed-width bezel makes it look great. I like it.

I don't know why the pencil didn't come with magnetic attachment from the get go, it seems like such a simple change that makes a big difference.

The cynic in me thinks that it was deliberately withheld in v1 so that a big deal about its addition could be made in v2.


Back then Apple didn't have much experience with wireless charging. They also didn't have any place to "dock" the pencil to the iPad, even though other companies were using magnets. So that seemed like an explicit decision at the time to me.

I saw a video where artists were demoing Photoshop on the iPad Pro. Most of them had experience with the Wacom stylus and complained that the Apple Pencil was too slim and cramped their hands. The obvious solution would be to add a rubber grip, which wouldn't really work with this storage/charging solution.


> I don't know why the pencil didn't come with magnetic attachment from the get go, it seems like such a simple change that makes a big difference.

The sides of iPad used to be rounded, so it wouldn’t really stick.


Could have made the Pencil triangular. The new shape fixes yet another annoying ergonomic issue with the original - that it's really hard to hold comfortably for extended periods of time. There's a reason pencils are hexagonal, and pens have rubber grips.


Still waiting on Xcode for the iPad "Pro"


Same. I was hoping we would see that.

The Swift Playgrounds app really hasn’t gotten any love[1] since it was first introduced. You can actually import UIKit and the other frameworks and compile them on the iPad, but this is pretty much undocumented. The preview views are tremendously buggy (really the whole app is) and there is all sorts of hidden debug stuff running in the background[2] that slows down performance. Everything has to be in a single file, so doing anything moderately complicated is a pain; the editor is buggy and will sometimes highlight just the last three letters of a word; and there is no Interface Builder[3]. I would love a real Xcode.

[1] There have been new playgrounds added (e.g. “Learn to Code with Swift”) but the app itself really hasn’t been changed at all. The same bugs have been present for years. And they aren’t obscure either. You will run into editor bugs within the first 5 minutes.

[2] The debug stuff is to help facilitate the different playgrounds, but doesn’t really provide any benefit to someone coding in a blank file outside of a playground. You can’t turn it off within the app.

[3] I know a lot of people don’t like IB, but it’s great for throwing together simple apps quickly. It’s odd that it is absent from the official Apple app, but Pythonista, an app made by one person, has a visual interface editor.


> The Swift Playgrounds app really hasn’t gotten any love[1] since it was first introduced.

The UI has been changed slightly.

> You can actually import UIKit and the other frameworks and compile them on the iPad but this is pretty much undocumentated.

Well, it's UIKit. You can just pull up the documentation at developer.apple.com.

> The preview views are tremendously buggy (really the whole app is) and there is all sorts of hidden debug stuff running in the background[2] that slows down performance.

The app is very buggy, but the debug stuff cannot be turned off because it's compiled into the code you write.

> Everything has to be in a single file so doing anything moderately complicated is a pain

No: you can add additional files to the "Sources" folder. This is how Apple's tutorial playgrounds work.


When I say it’s undocumented, I mean that the fact that you can use UIKit at all is undocumented. You can use it, but you are never told you can. Also, there is little documentation that explains things like XCPlayground and PlaygroundSupport. You kind of have to adapt them from the Xcode playgrounds for the Mac, but the fact that you can do this is never mentioned. Also, the limitations of Swift Playgrounds are never mentioned anywhere (e.g. Auto Layout constraints don’t really work for the Playgrounds previews).

Also, my complaint was that you cannot turn off the debug stuff. You actually can if you turn off things like logging in a template file on the Mac and then move that to the iPad. At that point you might as well just use the Mac, though. See here for what I’m talking about: http://saltpigmedia.com/blog/Modules-in-Swift-Playgrounds

Also, as far as I am aware, you cannot add to the sources folder on the iPad itself. That requires a Mac.

My comments were in regard to using Swift Playgrounds as a pseudo-Xcode. It does indeed compile swift code and UIKit, so a lot of that groundwork is there. And I would love to use the iPad to do real coding, and Swift Playgrounds is real tempting, but it just doesn’t work and I find that disappointing.


> When I say it’s in undocumented I mean that the fact that you can use UIkit at all is undocumented.

A couple of the example playgrounds expose this.

> Also there is little documentation that explains things like XCPlayground and PlaygroundsSupport.

Documentation is available here: https://developer.apple.com/documentation/playgroundsupport. It's not great, but it's getting better.

> Also the limitations on Swift Playgrounds are never mentioned anywhere (e.g. autolayout constraints don’t really work for the Playgrounds previews.

They do, but in a buggy way. I have long considered this to be a defect and have filed Radars about this.

> Also, as far as I am aware, you cannot add to the sources folder on the iPad itself. That requires a Mac.

You can create playgrounds on the fly with the Files app, if you know the structure they must have. You cannot edit them in Swift Playgrounds.
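For anyone wanting to try that Files-app route, here's a rough scaffolding script. The bundle layout (a `Contents.swift`, a `contents.xcplayground` manifest, and a `Sources` folder) matches Xcode-style playgrounds as far as I know, but treat the exact manifest XML as an assumption and diff it against a playground Xcode actually generated:

```python
import pathlib

# Manifest as generated by recent Xcode versions, to the best of my
# knowledge -- verify against a real playground before relying on it.
MANIFEST = """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<playground version='5.0' target-platform='ios'>
    <timeline fileName='timeline.xctimeline'/>
</playground>
"""

def make_playground(parent: pathlib.Path, name: str) -> pathlib.Path:
    """Create <name>.playground with a manifest, a stub source file,
    and an (otherwise uneditable-on-iPad) Sources folder."""
    bundle = parent / f"{name}.playground"
    (bundle / "Sources").mkdir(parents=True)
    (bundle / "contents.xcplayground").write_text(MANIFEST)
    (bundle / "Contents.swift").write_text(
        'import UIKit\n\nvar greeting = "Hello, playground"\n'
    )
    return bundle
```

Drop the resulting bundle into iCloud Drive and Swift Playgrounds should pick it up, though as noted above you still can't edit the Sources folder from the iPad itself.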


Is it just me or are Apple's chips and architecture just blowing anything Intel in the Macbooks (Pro, Air, etc) out of the water?

As an aside this iPad Pro will replace our home computer in the next month. It has as many USB-C ports as my 2016 Macbook, a better screen and a way better graphics card. Plus the ability to use it with or without a keyboard is amazing.


I'm really impressed that they made a number of environmentally friendly additions, though they haven't been played up as much:

- LED display is mercury-free
- No arsenic
- No BFR or PVC, etc.


This has been true of most of Apple's products for years. Generally they have a slide that summarizes this after every product announcement.


LED-backlit displays have always been mercury-free; it's the "ancient" CCFL-backlit displays that needed mercury.


This is unchanged from the previous model (and probably model before that, Apple has been at it a while).


I see it still has the stupid iOS home page where there are just 5 icons spread out horizontally when the iPad is in landscape orientation.

The new Apple Pencil is interesting. Now without the seamless round body, I wonder if the tip will wear out faster, given users will now always hold it the same way.


Can someone more educated on the subject please explain to me what would be required to turn this into a proper development machine? I'm asking about what iOS changes would be needed, rather than opinions about the physical aspects of it like form factor.


Depends on what you want to output. Webapps? Nothing, it already has all you need, especially if you are willing to use a remote server that you are ssh'ed into but you don't NEED that. For native compiled code and apps? Apple needs to release a Xcode compiler and a developer mode that lets you run self-signed binaries on the hardware.


Is there an inspector in Safari for iOS? Can I dig into the markup on a page and debug CSS and layout issues?


I do this today on my 6th gen iPad. I use MOSH, a BlueTooth mechanical keyboard, and Blink as my client. It works well.


If you develop anything for the web, don't you need to check how it renders on other browsers?


I don't do a ton of web front end work and what front end work I do is pretty simple. I'm 99% a native mobile developer and IoT C++/Python developer. Usually mobile safari is good enough for me.


Safari is usually good enough for what I do. I can use BrowserStack if need be, too.


Tangential, but it's worth noting that if what you are looking for is a portable development machine with tablet-like capatibilities, the Microsoft Surface series is arguably a better choice.


Or Google Pixelbook, if you prefer Linux over Windows.


You can program on an iPad already using interpreted languages such as Python and Lua. And you can SSH into other machines, run Git, and have a text editor already.

The main issue with developing on an iPad is that it’s not easy to load unsigned, compiled code on it, and even if you do, there’s not much that the code can do because of the system sandbox.
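To make that concrete, this is the sort of self-contained automation that runs happily inside an interpreter app's own sandbox (Pythonista-style, though that app is just one example — any on-device Python with access to its documents folder could run it), no unsigned compiled code required:

```python
import datetime
import pathlib

def sort_by_month(folder: pathlib.Path) -> int:
    """Move every regular file in `folder` into a YYYY-MM subfolder
    based on its modification date; return the number of files moved."""
    moved = 0
    # Materialize the listing up front so freshly created subfolders
    # aren't re-visited mid-loop.
    for f in sorted(folder.iterdir()):
        if not f.is_file():
            continue
        stamp = datetime.datetime.fromtimestamp(f.stat().st_mtime)
        dest = folder / stamp.strftime("%Y-%m")
        dest.mkdir(exist_ok=True)
        f.rename(dest / f.name)
        moved += 1
    return moved
```

Anything touching files outside the app's sandbox, though, runs straight into the limitation described above.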


I would have thought the main issue would be screen size. Don't developers like large screen real estate, being able to see documents side by side, or the code of your app, a browser rendering it and the debug tools inside the browser, and stackoverflow on the side?


It depends, mostly on preference, but also eyesight (and therefore, age).

My main dev machine is a 12” MBP. I have external monitors, but do quite a bit of work even on my 9.7” iPad. I spend 90% of my time in a terminal anyhow, and have good eyesight.


The parent comment asked to put aside hardware issues, so I left this out.


I've been looking into this as well. Here's some notes: https://www.notion.so/csytan/iPad-as-a-Development-Device-99...


The main issue in my opinion is the lack of mouse support. Not saying the OS needs to support mice throughout the UI but it would be great to be able to use one just with a remote desktop app, for instance.


You could use something like codeanywhere.com though from my experience we're still X years away from running IDEs in the cloud seamlessly


Depending on your environment you may be able to do it now. The rise of cloud coding platforms means that coding on an iPad is pretty useable as long as your deployment process works with it.


I really like where the iPad is going. I play Civ 6 on a Pro with an Apple Pencil all the time, and I wish the screen was bigger. It is now, but that price... I could get a gaming rig for the cost of a 12in. My only question is where is the mini with FaceID?


It's interesting they mention the Xbox, when it's obvious they want to compete with Nintendo.

Apple is quickly realizing that they missed a major market opportunity by not designing snap on controllers that convert iPads into Nintendo Switches. Touch works only for a subset of games, but a first-party controller would have absolutely destroyed the market.


Absolutely agree! The iPad is already my main gaming machine but it's also 2-4x more expensive than a console. At which point, does it make sense as a gaming platform for anyone not making an above average income?


> I really like where the iPad is going.

You mean becoming more and more like a Surface Pro?


I've owned both and the iPad Pro was better for me in almost every way. The main issue with the Surface Pro is the horrid windows software eco-system, especially around touch interfaces. Almost without exception windows touch apps are terribly designed.


This is what I am hoping for. I want to have my tablet mode with iOS apps and interface. Then I want to switch into laptop/desktop mode, being able to spin up any desktop application.

Even more a dream state: would want a docking station that allows me to use my multi monitor setup.


The mini with FaceID is the iPhone XS Max, more or less.


Less... a bezel-less mini could be 2 inches larger and have much more battery life. I pretty much only use my iPad mini at home for reading and browsing, so it doesn't need cellular nor the world's best camera.


Agree on the size, but that price...


I found the announcement an interesting mix, more like the original iPad where there is potential in the hardware that isn't available in the software.

The increasingly similar optics to the Surface Pro are amusing as well, design students would do well to read the coverage of these two systems from introduction to the present day. A lot to be learned there in terms of understanding how systems like these evolve.

I am bummed that the old Pencil isn't compatible. You do get better recharging, 'tap to change tools', and some tilt and pressure sensitivity, but how much? Actual specs seem to be elusive.


It is impressive but an overkill imho, going into desktops’ territory. I would have bought a refreshed iPad mini instead.


Certainly not "desktop territory" (at least, 2018 desktop), but it's definitely in "laptop territory" now, which is a major advancement.

Desktops are 8-fat-core behemoths in 2018 with incredible potential. Ryzen 7 2700X == 8 cores / 16 threads. Intel i7-9700K isn't even the top end anymore and is also 8 cores (only 8 threads). And it's actually reasonable to build a consumer desktop with 64 GB of RAM (Ryzen 7 2700X or Intel i7-9700K).

------

iPad Pro may be 8-core... but its configuration is 4 fast cores + 4 low-power cores, still built to reduce power consumption at the cost of speed/performance.

4GB of RAM is also pitiful and won't be sufficient for desktop-level tasks.

Desktops can run 8 fat cores all day long due to their wall-power connection. iPads are still designed for mobility and low power consumption. It's a good compromise, but I wouldn't bill it as anything more than a laptop-class system.


PC gaming has been the driving force behind the recent improvement in PC shipments. And I see the iPad Pro, with a magic keyboard and mouse, as an opportunity that could finally disrupt the PC gaming business, or any gaming that requires keyboard and mouse: PUBG, C&C, Civ, Diablo, online gaming. The iPad Pro could now run all these with an external monitor.

Basically it is Apple and iOS that is limiting the iPad Pro Usage. Give me a BT 5.0 Keyboard and Mouse, along with a Sublime Editor I could even replace my MacBook Pro.


Wouldn't it make sense to use the iPad Pro as eGPU and second monitor for a MacBook Pro/Air?


Duet display does this really well with my current iPad from a MBP.


This image showcasing the thinness of the iPad looks perfectly flat: https://www.apple.com/v/ipad-pro/p/images/overview/portable_...

However in the first image showing the iPad from the side there's a clear bulge, I'm assuming for the camera: https://www.apple.com/v/ipad-pro/p/images/overview/hero__b2q...

So is it flat or not? It might seem like a small nitpick but I hate this trend of having a bulge for the camera just so that you can say that the rest of the assembly is like 2mm thinner. Who cares? Now you have something that looks and feels worse and you can't put it flat on its back.

I hope they at least engineered it in a way that makes the tablet stable when lying down on a flat surface instead of wobbling on the pivot of the camera bulge.

EDIT: Actually I just noticed that the bulge seems to be here in the first image, it's just hidden in the palm of the hand. Good job Apple marketers, bad job apple designers.


See also: any phone with a notch, where the notch conveniently hides in a black portion of the wallpaper/promo image everyone's using:

https://i.imgur.com/zvSr8pf.png


The base 12-inch model is $999.

The Pencil ($129) and Smart Keyboard Folio ($199) are not included.

At $1,327, how does it compare specs-wise with the new 13" MacBook Air?


Still can't plug in an external USB hard drive? That's a deal-breaker.

Any "pro" device needs to allow wired external storage and basic access to files for transferring to and from storage devices. That's just common sense.

To get my video files and large files, for example from my computer to the iPad, I need to go through iTunes or some annoying and slow wi-fi process, which is needlessly limiting.

In regards to the keynote, the constant applause and cheers from the audience is back, worse than ever. Why would the audience applaud "amount of units sold worldwide"? Just because Tim Cook ramps up his voice and prompts for applause at every bullet point in his powerpoint, doesn't mean the audience should oblige like well-behaved school children.


Swiping all the way down the screen to unlock with FaceID is going to be a big burden on the iPad Pro. Less of a burden on the iPhone series for obvious reasons, but really? Why not let people swipe up anywhere on the screen to unlock with FaceID?


Don’t you swipe up to unlock with Face ID?


Yeah, I meant having to swipe up from the bottom of the screen.


Agreed. I’ve gotten somewhat used to it now, but I preferred the old gestures from iOS 11 on the iPad over the new iPhone X style gestures in iOS 12.


Away from work, I use my iPad for most things and only use my laptop for writing code. As famous hacker Kevin Mitnick says in his new book “The Art of Invisibility”, using locked down devices like iPads and Chromebooks provides the most safety.

I like the jump to USB-C and I am curious what the ‘docking’ experience is when the new iPad Pro is plugged into a large USB-C compatible monitor. This might make the developer experience very good and open the door to making something like the new iPad Pro in the future be the only computing device required. Add LTE and a data plan for mobility and traveling lite.


Do the new iPads all have giant sticking-out camera lenses in one corner?

It's already bad enough that my iPhone can't sit flat on a table because of the camera lens. It would be super annoying in an iPad!


One thing I was missing from the presentation: Can it finally connect both a keyboard and a pencil, i.e. does it have magnets on both sides?


The keyboard looks to attach to the left side in these pics: https://www.apple.com/smart-keyboard/

While "the new Apple Pencil starts charging when you place the flat part on the right side of your iPad Pro." from: https://www.apple.com/apple-pencil/

So it looks like the answer is yes.


There was a passing screenshot in the video with both used at the same time. This was my biggest question towards the end, and I wish I'd screenshotted when I saw it. But they do look to both work together, and glad the documentation lines up with that.


I seem to recall seeing a 2 second thing where they had a pencil and keyboard attached at the same time. And I'm pretty sure they mentioned 192 magnets built into the edges of the Pro.

I'm hoping this leads to a future where 3rd party accessories snap on and charge.


I guess it doubles up as a harddisk eraser then...


They mentioned that they have magnets all over the thing.


USB-C port. Great!


As a heavy iPad Pro user, the number one thing I’m looking forward to is magnetic pencil charging. With kids around, the old way of sticking it into the Lightning port is just asking for trouble — and I never have the dongle around for cable charging when I need it. The grooved shape of the pencil looks like an ergonomic improvement.


How do I connect my headphone cable to this?


Dongle!


Who are the people out there that use the rear facing camera of a 13 inch tablet enough to justify that camera bump? It isn't unusual for Apple to make unpopular design choices, but you at least can usually see the logic in their decisions. I can't fathom the reasons behind this one.


By my count, everyone's parents over the age of 50.


My mom still carries around a terrible 10-year-old point and shoot because I can't convince her that her iPhone camera is far superior.


Love how the frame switched back to the iPhone 4's style. Feels retro and futuristic at the same time.


Wishing the iPad mini form factor would get some love, or at least a version that works with the Pencil


For me this was magical. I love the smaller 12.9”. The size seems just right and I’m sure that’s the one I’ll be getting to replace my 15” MBP for everything portable like travel.

This looks like the perfect tablet and I’m not sure what else they could do to make it better going forward


I still hold my doubts that this could even nearly replace a traditional machine in a real professional's workflow. If anyone has real-world performance comparisons, I'd be interested to see... But the iPads still feel like an expensive toy to me.


FaceID makes a lot of sense for the iPad, arguably more than the iPhone. What I like about my TouchID is being able to unlock my phone in my pocket or as I'm sliding it out. I'm never going to do that with an iPad.


The 11" cellular model weighs the same as the 11" wifi model. But the 12.9" cellular model weighs 2 grams more than the 12.9" wifi model.

Anyone know why this would be? Obviously not a big deal — just curiosity.


While I have no way of actually knowing, my assumption is battery. They have historically juiced up the battery on cellular models.


No, iPad batteries have always been the same on all models. Maybe you’re thinking of the watch?


Different antenna perhaps?


Vertical scrolling on a webpage should NEVER mean horizontal scrolling. T_T


That got me too. There is a real disconnect when I am swiping down on my magic mouse and the page is going sideways. They broke their own HIG with this one...


I had to go and see what all the scrolling fuss was about.

That sideways scroll is so irritating, and just when you (hopefully) start getting used to it, it starts scrolling down. :facepalm:


"A12X Bionic is the smartest, most powerful chip we’ve ever made. It has the Neural Engine, which runs five trillion operations per second" but can it mine?


The hardware is amazing and for drawing/art is one of the best devices out there, but for many tasks (such as programming) it's basically useless.


I expect this iteration of Macbook Airs to be the last one.

When the time for the next refresh comes, Apple will have migrated their portable line to these iPad convertibles.


If only it ran macOS.


I want to try and charge a MBP off the iPad Pro!


It probably won't, though. I have been screwed a couple of times when my USB power brick tried to charge itself off my phone because I hadn't pushed the button to switch the port from drawing power to delivering power.

Is there some sort of protocol in USB-C for two devices with batteries to negotiate which should charge the other? Though I imagine the iPad could just refuse to deliver the high-power states, or not have the circuitry.


It works for Macbook to Macbook charging, so why wouldn't it work with the iPad?

Note that your device doesn't have to provide the requested power if it is unable to, it can provide less.
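A toy model of that capability-selection step, purely to illustrate the "provide less" point — real USB PD exchanges Power Data Objects over the CC wire and lets two battery-powered devices swap roles via PR_Swap, none of which is modeled here:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Profile:
    """A fixed-supply offer: e.g. 5 V at up to 1.5 A."""
    volts: float
    max_amps: float

    @property
    def watts(self) -> float:
        return self.volts * self.max_amps

def pick_profile(source_caps, sink_max_volts, sink_wanted_watts):
    """Choose the advertised profile giving the most usable power.

    The sink never accepts a voltage above its limit, and draws
    min(wanted, offered) watts -- i.e. the source may lawfully
    provide less than the sink hoped for.
    """
    usable = [p for p in source_caps if p.volts <= sink_max_volts]
    if not usable:
        return None, 0.0
    best = max(usable, key=lambda p: p.watts)
    return best, min(sink_wanted_watts, best.watts)

# A phone-class source offering only 5 V / 1.5 A, charging a laptop
# that would like 45 W: it still charges, just slowly.
best, drawn = pick_profile([Profile(5.0, 1.5)],
                           sink_max_volts=20.0, sink_wanted_watts=45.0)
print(best.watts, drawn)  # 7.5 7.5
```

So an iPad could simply advertise only low-power profiles (or none), exactly as the parent comment speculates.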


Doesn't the MacBook require a certain minimum power to be available before it will start trying to charge, though? I.e., if I plug it into an old 0.5A USB charger, would it try to charge off the 2.5W? That's most of what I meant when I was talking about providing enough power.


It charges quite happily off the 3W max which is standard on most non-power-delivery USB-C ports (inc Apple's, iirc)


I bought an iPad Pro (12.9") in mid 2017, and am already seeing significant slowdown. With 4-5 apps open, it lags when trying to scroll the home screen. I frequently need to kill open apps by going into the app switcher.

When I bought a new iPhone, the difference was night and day. I regularly leave 30+ apps open without worrying about lag.

Not sure why the iPad Pro struggles so much with multitasking, but it's kind of turned me off from buying another.


You might try a fresh install. I have the same iPad, and haven't seen behavior like that.


I have the 2nd gen iPad Pro (2017) and I haven't seen this kind of behavior. Maybe time for a fresh install; you shouldn't need to do that, but sometimes it helps.


I bought one when it first came out in 2015 and it is still running fine. Maybe reinstall or check the battery? Does it discharge quicker than it should?


> I frequently need to kill open apps by going into the app switcher.

This doesn't help most of the time.


Can I add a mouse yet?

The "what's a computer" ads are disingenuous because Apple needs to sell computers and a tablet.


Does anyone know if these new models have 120Hz displays? I couldn't find any mention in the tech specs.


They do. Apple calls this technology ProMotion and it’s listed in the specs.


Finally, the beginning of the end for the Lightning port! Hopefully next year's iPhone will be USB-C as well.


I don't see an audio jack in the specs. Am I right to assume the ipad is switching to dongles too?


Yes, they apparently dropped the jack here too. They probably did not say anything to avoid the same backlash as with the iPhone.


Yes and the dongle will be sold separately. 1 step forward, 2 steps back.


No headphone jack IIRC. Not sure if they ship with an adapter in the box.


I think I will buy one of these, a new Mac mini, and a MacBook Air. It was a successful event, I think.


Astonishing. Mainly, I just can't believe they used horizontal scrolling on their page. UX? Nope.


It's all about the UX right?

I scroll down on my mbp laptop and the @#$@#$ pictures move to the right...


How long til they stop selling laptops or developing MacOS (as opposed to iOS), ya think?


I would have to imagine that they have a huge market share of Developer laptops. Unless they are willing to jeopardize all of that or integrate developer first productivity into a tablet which nobody has done yet, MacOS will stay.


How much revenue/profit do you think they get from that (possibly, imagined) huge market share? Compared to iOS?

I guess that is something we could research.

This isn't profit/revenue, but some brief googling suggests 77 million iOS devices shipped in Q1 2018, and 5 million MacOS devices in the same quarter.

It seems clear that devices like this one announced are at least intended to capture some of the previous market for MacOS laptops as well.

Market share doesn't mean much if the market isn't contributing much to their bottom line, and/or has an expense to produce out of line with its revenue/profit.

Ah, here we go:

"The Mac's Waning Relevance to Apple" https://www.statista.com/chart/8817/mac-sales-as-a-percentag...

I think releases like OP will accelerate this trend further.


That horizontal scroll web page was the most jarring thing my brain has done in a while.


That camera bulge!

I need a new iPad Pro, but that really makes me think twice about whether I should get the older version (and save some money too).

I'm also really unhappy about needing a dongle to connect my headphones. I use my headphones with my iPad daily. I'm also not too happy about needing two chargers, one for my phone and one for my iPad.


I really loved my iPad Pro, but the point really holding it back from replacing my laptop is the browser. It's a beautiful retina 12.9" screen, but is forced to use the mobile Safari rendering engine. Give me a full browser, or let Chrome do it. What a painful limitation for a great potential daily-driver.


They could have called it the iSurface.

Their feature page is a 1:1 match to Microsoft's Surface page, even down to the screen-to-body ratio (which is actually larger on the Surface 4) and having face unlock only (no fingerprint).


Does anyone know if the iPad pro will support a mouse via USB-C or Bluetooth?


No, it will not. The UI Frameworks do not support the concept of a cursor (except for text entry, where it is a bit different than on a computer).


Something in the simulator supports a mouse, so I wonder if it's down in the code somewhere, just not enabled.


macOS passes mouse events to the simulator app, which translates them into touch events. Unless you’re talking about something else?


That is what I'm talking about, but I didn't know if it was macOS doing it or what.

I've seen videos where there is a faint circle representing where the cursor is and then it 'touches' the screen.

shrug I just wasn't sure if that was part of the ipad itself (like maybe accessibility) or all from the simulator.


> I've seen videos where there is a faint circle representing where the cursor is and then it 'touches' the screen.

That's coming from the simulator, IIRC. Hold down the alt key and it will show up.
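Conceptually the translation is simple enough to sketch. The event names below are invented for illustration and are not Apple's actual simulator API:

```python
# Hypothetical sketch of translating host mouse events into synthetic
# touch events, the way a simulator conceptually might.

def mouse_to_touch(mouse_events):
    """Map (kind, x, y) mouse events to touch-phase events."""
    touches = []
    pressed = False
    for kind, x, y in mouse_events:
        if kind == "down":
            pressed = True
            touches.append(("began", x, y))
        elif kind == "move" and pressed:
            touches.append(("moved", x, y))
        elif kind == "up" and pressed:
            pressed = False
            touches.append(("ended", x, y))
        # Mouse motion with no button held produces no touch at all:
        # a finger that isn't on the glass doesn't exist to the OS.
    return touches

drag = [("move", 5, 5), ("down", 10, 10), ("move", 12, 14), ("up", 12, 14)]
print(mouse_to_touch(drag))
# -> [('began', 10, 10), ('moved', 12, 14), ('ended', 12, 14)]
```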


This mouse already works with the iPad:

https://www.swiftpoint.com/store/swiftpoint-gt-mouse-2/

I haven't tried it; I just came across it the other day.


That only works with their VNC software. So, a mouse on Windows, through remote desktop running on the iPad.

The iPad itself still has no concept of a cursor.


Oh I didn't realize that. Thanks for clarifying.

Funny, good old HN: I lost a point for not knowing that, even though I mentioned I had only come across this product and never used it, lol.


How so? iOS doesn’t have a cursor.


> Translation: It’s faster than most PC laptops

Care to give actual data to substantiate that, Apple?

> A12X Bionic delivers 2x faster graphics. (footnote: compared to the previous generation).

What does that even mean? The footnote is even more obscure than the marketing piece. "Fast" measures what exactly?


> Care to give actual data to substantiate that, Apple?

The chip seems like it's designed to beat benchmarks, honestly. The A12X has an incredibly HUGE 128 kB L1 data cache, but only an 8 MB L2 after that (while your typical desktop/laptop has a 64 kB L1 data cache, 256 kB or 512 kB of L2, and then 8 MB of L3 after that).

Anandtech recompiled SPECint2006 for the iPhone A12 processor (and the A12x is a faster version than the A12) and it performed well in benchmarks: (https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-re...)

This includes a suite of tests, from Perl to GCC to h264 (CPU-only) encoding. Now, it isn't "official" SPECint, but AnandTech hopefully used the correct compile flags for the test. (We just have to hope AnandTech did everything correctly to make it an apples-to-apples comparison.)

> What does that even mean? The footnote is even more obscure than the marketing piece. "Fast" measures what exactly?

"Fast graphics" generally refers to the ability to run graphics benchmarks well.

So benchmarks like Ice Storm (OpenGL) would run faster on the iPad A12X. GPUs need to calculate where all those triangles go after affine transformations: there are a lot of matrix-multiplications that go into GPUs.

It's a bit of a special case, so this is generally called "graphics performance".
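To make the "lots of matrix multiplications" point concrete: transforming a triangle is one matrix-vector multiply per vertex in homogeneous coordinates, something like this plain-Python sketch (a GPU runs the same math in hardware, millions of times per frame):

```python
# Vertex transformation as matrix-vector multiplication: a 2x scale
# about the origin applied to a triangle, in homogeneous coordinates.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

scale2x = [
    [2, 0, 0, 0],
    [0, 2, 0, 0],
    [0, 0, 2, 0],
    [0, 0, 0, 1],  # translation would live in the last column
]

triangle = [(1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 1)]
print([mat_vec(scale2x, v) for v in triangle])
# -> [(2, 0, 0, 1), (0, 2, 0, 1), (0, 0, 2, 1)]
```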


What is the max resolution of the external display?

Also, in the video editing app shown in the keynote, the external display hides the control elements, which is very nice. Exposed API, or internal Apple magic?


Is that a camera bump? So it doesn't lie flat on the table?


The camera does protrude from the back, yes.


Sure, it looks cool now, but how long until they nerf it and break it like they did the iPad 1? I can't even use the App Store anymore on my iPad 1 because Apple broke it. I'm certainly not going to buy this anytime soon.


handwriting recognition

Is anyone else using an iPad Pro missing in-place handwriting recognition, as on the old Samsung tablets where you could annotate and write everywhere?


Does Apple even plan to support a proper web browser?


I wish so much we could run macOS on those.


I suspect the future is the opposite of your wish, that eventually there will only be iOS and no more MacOS.


Biggest change was the Lightning -> USB-C port change


That site is horrible. First it scrolls horizontally and then it switches to vertical towards the end. Why do "designers" have to reinvent simple things again and again? Is there nothing better to do?


I'd argue that while the UX is questionable (panning the page right to left while the user scrolls down), the website loads pretty fast, which is a good thing, and the scrolling is smooth. The page also weighs less than 14 MB.

Here is an example of horrible UX plus horrible loading time plus janky scrolling, brought to you by Google: 60 seconds after the page was opened, the content at the top of the page was still not done loading (72 MB before I closed the page).

https://store.google.com/us/product/pixel_3

I think it's hardly just a matter of design here.


>less than 14 MB

It's kind of crazy to me that even that's considered low


I remember when Firefox 2.0 came out, it was 6 MB.

Way more impressive, though, was Opera 9 at 5 MB, which, unlike Firefox, also had a built-in email client, IRC client, download manager, BitTorrent client, ad blocker, debugger (Firebug was an extension at the time), sync, and several other impressive features.


Agree. And that people feel the need to say "at least it works without JavaScript". Our standards have become really low. I would argue that for product sites like this we have massively regressed over the last 10-15 years: sites that used to be easy to read and find information on have become a nightmare of one useful paragraph per page.


Since this is a product page, I wonder how much of that 14 MB is binary data vs. HTML/JS/CSS. Binary assets tend to inflate sizes by a pretty big amount, and 14 MB wouldn't be big if you've got a lot of high-resolution hero shots. A 14 MB bundle size would be ridiculous, though.


Well, I guess it's a little crazy if all you were expecting was 8-bit graphics. But for content like that, yeah, I'm amazed it fits in 14MB.


It’s mostly high-res images, so I think that might be OK in this case.


Why do they need these monster images?


So it looks nice when people browse the website on high-resolution screens?


I guess this is a good example of what's wrong with today's web design. There is almost no content on the page, scrolling is f...ed up, but we get some nice-looking high-res images.


I get horrible scrolling on the Apple one and smooth as butter on the Google one. This is Chrome on a 2013 MBP running macOS Mojave. I will say that the Google one takes way too long to load things, for sure.


Less than 14mb is low? I don't see why that page should be more than 1mb or just around that


> Less than 14mb is low? I don't see why that page should be more than 1mb or just around that

If you read my message carefully, nowhere am I saying 14 MB (not mb) is "low".


I agree, but I gotta give it to them that it works great with JavaScript disabled: it becomes a normal, vertical-scroll website. Better than with JavaScript, in my opinion.


I did that for many years, but got fed up with landing on blogs that would display a blank page because you absolutely need JavaScript to render a few paragraphs of text in a browser.


Maybe it should be a vertical scroll site no matter what? What does horizontal scrolling add?


I couldn't agree more. Why does a simple vertically scrolling page need to be "spiced up"? I couldn't find a way to swipe on my non-touch ThinkPad laptop; I ended up using the right arrow key.


Horizontal scrolling... you try it once because you think it might be cool; then you realise some things are an archetype that shouldn't be played with, like vertically scrolling websites.


The first marketing picture shows a landscape iPad as the main image, to help connect you to the idea that it's not a tablet but a computer. I think the side scrolling is a conscious choice which is supposed to reinforce that as well. I don't expect the horizontal scrolling decision will carry on longer than this page.


Yup, this is what happens when you have an in-house design department (I don't know if Apple does, though, to be honest). They always need to be working on something, which means they will always be trying out new things, for better or worse. It's how we ended up with the YouTube and Reddit redesigns, both universally disliked.


Even worse, agencies will now see side-scrolling websites as "in" again. Expect to see a return of the dreaded side-scrolling corporate website :-P


Why does using two-finger scroll vertically result in the page moving sideways? I think my touchpad driver is messed up, better visit the genius bar.


Hey, on IE it just scrolls vertically from start to end. Too bad the rest of IE sucks.


The person who decided to make this scroll sideways instead of down when you scroll down, and not to scroll sideways when you scroll sideways should be put in stocks in the town square.


Maybe you didn't get the memo, but it's actually an early release of the 'new' way to scroll.

Scrolling 1.0 = normal scrolling

Scrolling 2.0 = left/right up/down directions are inverted respectively

Scrolling 3.0 = left/right scrolling only, via up/down gestures (use left/right gestures at your peril)


> Scrolling 2.0 = left/right up/down directions are inverted respectively

Man, this thing drives me nuts, and it's the first thing I change in every touchpad-equipped device I own.

Scrolling is not swiping, people!


I'm the complete opposite. Touchpads not using reverse scrolling feels so weird to me.


It's the touchpad version of whether joystick-down means nose-down, or nose-up, in video games. Endless debates can be had!


Joystick down means nose up because it's simulating the yoke of your airplane in which you pull back (joystick down) in order to "pull up" the nose of the plane.

The recent fad of reversing the touchpad scrolling direction is an attempt to simulate how we use our phones in which we tap the content on screen and drag it around to pull other content into view.

Because the touchpad was invented to replace the mouse, I expect similar motions to produce the same results. For example: scrolling the wheel down moves your viewpoint in the current window down, so a similar motion on the touchpad (a swipe down, using two fingers to differentiate it from a normal touch) should produce a similar result. Instead, we get manufacturers who by default include unintuitive configurations of something that had been working for decades for billions of people. At least for now we are allowed to change those settings.
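The whole disagreement really is a single sign flip: does a downward swipe move the viewport down ("traditional") or drag the content down ("natural")? A sketch (the flag name is made up):

```python
# "Traditional" vs "natural" touchpad scrolling is one sign:
# traditional = the swipe moves the viewport; natural = the swipe
# drags the content, so the viewport moves the opposite way.

def viewport_delta(swipe_dy, natural=False):
    """How far the viewport moves for a downward swipe of swipe_dy px."""
    return -swipe_dy if natural else swipe_dy

# Swipe down 120 px on the touchpad:
print(viewport_delta(120))                # -> 120   (view scrolls down)
print(viewport_delta(120, natural=True))  # -> -120  (content dragged down)
```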


> Joystick down means nose up because it's simulating the yoke of your airplane in which you pull back (joystick down) in order to "pull up" the nose of the plane.

Of course, but that's just a metaphor. Lots of people pick up a game controller having never used a joystick or a flight simulator before, so lots of arcadey flying games do it the other way, and receive scorn from people who prefer yoke-style controls. Who are, in turn, considered uptight and annoying by people who want arcade controls.

Witness endless discussion on the subject:

https://www.giantbomb.com/forums/general-discussion-30/inver...

https://battlefront-forums.ea.com/discussion/76095/let-us-cu...

https://www.supermodel3.com/Forum/viewtopic.php?f=2&t=1101


That may be because you use lower quality input devices. On Apple computers, the touchpad sensitivity tends to perfectly match the screen movement, making it very intuitive that pulling a screen down would pull it down. A manufacturer copying this with inferior hardware sensitivity and feel would indeed make it feel frustrating.


Input devices should allow users to configure a velocity mapping curve (which could be negative, to reverse the motion). X Windows had a crude threshold-based mouse acceleration scheme, but it's better to have an arbitrary curve, like the TrackPoint uses, that can be optimized for the particular user, input device, screen size, and scrolling or pointing task at hand.

https://wiki.archlinux.org/index.php/Mouse_acceleration

One of the patented (but probably expired by now) aspects of the Trackpoint is that it has a very highly refined pressure=>cursor speed transfer function, that has a couple of plateaus in it that map a wide range of pressures to one slow or fast but constant speed. The slow speed notch is good for precise predictable fine positioning, and the fast speed notch is tuned to be just below eye tracking speed, so you don't lose sight of the cursor. But you can push even harder than the fast plateau to go above the eye tracking plateau and flick the cursor really fast if you want, or push even lighter than the slow plateau, for super fine positioning. (The TrackPoint sensor is outrageously more sensitive than it needs to be, so it can sense very soft touches, or even your breath blowing on it.)

Here's a description of Ted Selker's work and all the refinement that went into the TrackPoint, that he developed at IBM Almaden Research Lab:

https://news.ycombinator.com/item?id=9438461

Highlights:

In 1984 he observed that it took 0.75 - 1.75 seconds to reposition the hand from the keyboard to the mouse, which is a long time for something that you do quite often.

He tried many different ideas and built several prototypes, then later, when he was working at IBM Almaden Research Lab, he had a chance to refine the idea into a product.

He had his father, a material scientist, help by designing the special non-skid rubber that the clitoris was made from.

IBM wouldn't let him ship it until it was measurably as efficient as a mouse for common tasks.

The thing going for it was that it eliminated the 0.75 - 1.75 second hand repositioning penalty, but of course the fundamental problem with it that you can't get around is that it's a relative positioning device, not an absolute positioning device like a mouse. So he had to come up with ways of overcoming that problem.

The trackpoint performs very well for mixed typing and pointing tasks, since you switch between typing and pointing so often, and that adds up to a lot of time, and is a very common way of using computers. The mouse is still better for tasks that are mostly pointing and clicking, but it takes up some prime real-estate on your desk, and there are many situations where a mouse is impossible to use with a laptop.

He also made the observation that when the cursor moved above eye tracking speed, you tended to lose track of it. And also the observation that some of the time you needed to position it finely around a small area, and other times you needed to move it quickly across a large area.

So he came up with a pressure-to-speed "transfer function" that had a non-linear mapping from how hard you were pressing it to how fast the cursor moved.

The mapping had a plateau at "predictable fine positioning speed" (i.e. there was a wide range of light pressure that would map to moving the cursor at one exact slow predictable speed, so you could smoothly cruise the cursor around with a light touch at a speed that was good for exact positioning. Then after the plateau of light pressure, it sloped up smoothly until just below eye tracking speed, where there was another plateau, mapping a wide range of harder pressure to a fast-but-not-so-fast-that-you-lose-track-of-it speed, for coarse positioning without losing the cursor. And then above that there was a fast speed for quickly flicking the cursor to the other side of the screen.
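A transfer function with two plateaus like the one described might look roughly like this; every threshold and speed below is an invented placeholder, not IBM's actual tuning:

```python
# Illustrative pressure -> cursor-speed transfer function with two
# plateaus, shaped like the TrackPoint curve described above.
# All numbers are made up for the sketch.

SLOW = 20    # px/s: predictable fine-positioning plateau
FAST = 400   # px/s: just-below-eye-tracking plateau

def cursor_speed(pressure):
    """Map normalized pressure [0..1] to cursor speed in px/s."""
    if pressure <= 0.0:
        return 0.0
    if pressure < 0.3:            # light touch: fine-positioning plateau
        return SLOW
    if pressure < 0.6:            # ramp smoothly from SLOW up to FAST
        t = (pressure - 0.3) / 0.3
        return SLOW + t * (FAST - SLOW)
    if pressure < 0.9:            # coarse-positioning plateau
        return FAST
    # push even harder to flick the cursor past eye-tracking speed
    return FAST + (pressure - 0.9) * 4000

for p in (0.1, 0.45, 0.75, 1.0):
    print(p, cursor_speed(p))
```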

Once I was sitting in a coffee shop in Mountain View hacking on my Thinkpad, and Ted and his wife Ellen rolled in, sat down, and started chatting. Ted noticed that my Thinkpad's Joy Button was all worn down, and he was mortified and quickly excused himself to go out to the car. Ellen rolled her eyes and shrugged, explaining that he was always like that. Then he came back with a big bag of red Joy Buttons, and replaced my worn-out one right there in the coffee shop, and gave me a few extras as spares!


Good information, but it's not a clitoris.


The original name was the "Joy Button", but that was too much for IBM.

So after pooh-pooh-ing the name "Joy Button", IBM finally settled on and trademarked the name "Trackpoint." But one concession they made, was when they published a two page ad spread in Time Magazine with a close-up of the Trackpoint, above the slogan "So hot, we had to make it red!"

Ted Selker also made a prototype Thinkpad with TWO hot red Trackpoints on the keyboard, which invitingly resembled a pair of nipples. It was very popular with everyone he tested it on, but unfortunately OS/2 had no idea how to cope with two pointing devices, so there wasn't much use for it, besides being a wonderful ice breaker at parties.


"Clit Mouse". It's old slang. Not the best, but the best way to get someone to realize what you're talking about.


Don't forget, Scrolling 3.0 is only available on this year's New ScrollBar(tm)


Requires ECMAScript 2021, but transpilers should be available soon on npm.


It’s only available in rose gold.

It can be yours for the bargain price of $1999.95 USD.

Don’t be the last person in your herd of Apple users to pay money to add scrolling functionality back to your MacOS and iOS devices.


Courage.


I think it's kind of cool. /dissenting-opinion


I'm presuming you viewed it on a mobile device. On my laptop running Firefox, I had this experience:

My laptop is presently docked to the right of my monitor on my desk. When docked I use a keyboard and mouse, with the actual laptop being slightly to the right; using its touchpad is somewhat awkward but doable.

Space bar to go down doesn't work. Page Down/Up don't work. Home/End don't work. Arrow keys don't work. Clicking on the little arrow doesn't work. Dragging the screen in that direction with the mouse doesn't work. Grabbing the scrollbar manually and pulling it down doesn't work. I literally skipped that section because I couldn't figure out how to read it. Note I didn't realize you could grab the scrollbar manually to scroll horizontally, because I have never had to do that.

After reading this thread I tried reaching over awkwardly to use the two-finger scroll mode on the touchpad, keeping in mind that the normal sensitivity is such that one full swipe top to bottom (about 2 inches) is 2 pages of text. So about 1 inch per page of text.

Swiping through the first section, the "overview", took 62 awkward left-to-right swipes of about 2.5 inches each: about 155 inches of travel, almost 13 feet. For the last few it actually went down while I scrolled sideways, which made me pause for a moment.

Scrolling through pages of content is so basic that going through the massive work of making a beautiful device and all the software required to run it, and then making reading about it that painful, is remarkable.

It's like launching a new high-end clothing store and making the entrance a Dance Dance Revolution pad where one must complete a hip-hop number to enter.

It isn't merely subjectively terrible.

Edit: On my phone, also running Firefox, in vertical orientation it scrolls normally but cuts off a bit on the right-hand side; in landscape orientation it scrolls normally AND looks correct.


>I'm presuming you viewed it on a mobile device. On my laptop running firefox I had this experience.

I viewed it on Firefox on my MBP and thought it was great.


Did you have to scroll right 62 times to actually view all the content?

Were all the normal keyboard/mouse wheel navigation methods broken?

Seemingly, good design would imply working for the 90% that don't presently use a Mac, or detecting that a device isn't a phone and presenting a usable interface.

Seriously how do you screw up something simple like scrolling?


Firefox 63 on Mac works fine for me on this page. Spacebar, page-down, and scrollbar all work (i.e. the content moves a page at a time but horizontally).


Chrome 69 on Linux scrolls horizontally at a much saner speed and has working arrow keys, but it is a fail in every other way mentioned. Firefox 65 on Linux is a horrific user experience, as described.

An article, unlike a computer, isn't a new exciting tech marvel; it should be simple and work on any device that can render text and multimedia.


I've tried Firefox, Safari, and Chrome. And in every single one of them, scrolling works fine with spacebar, with the mouse, with Page up/Page down, and even that vim*-extension's 'd'.

I also enjoyed the effect. For anything I visit repeatedly, such as webmail or GitHub, or news websites, I prefer a "lean" implementation. But a product page that I will visit maybe twice in its lifetime? Knock yourself out!


Novelty and breaking expectations regarding established HCI conventions are different sides of the same coin, I guess.



If you scroll long enough, it switches and starts scrolling down.


I thought you were joking and then it happened and I was still startled.


Steve is rolling over in his grave on this.


*Flipping head-to-toe


No Steve, please don't do this. This will mess up everything.

https://en.wikipedia.org/wiki/Tennis_racket_theorem


I'm pretty sure that Apple was using parallax scrolling back in the days of Steve Jobs.

There's also a strange contradiction in these millions of people always complaining about how Jobs would hate this or that, while essentially relying on the implied authority of the name!

If you all are capable of accurately predicting Steve Jobs's hypothetical opinions on all matters of design, I wanna see some products you designed.


No, that's not a valid argument.

Apple[Jobs] produced a steady stream of industry game changers - original iMac, iPod, iPhone, iPad, even the MacBook Air. Plus supporting ecosystems and software, including Siri and iCloud. And business relationships - with music companies, movie companies, phone carriers, podcast creators, and app devs.

Apple[Cook] has acquired a mountain of cash, but product development has followed a consistent and predictable faster+thinner+more_expensive path.

No game changers. Not one. Watch is the closest thing to a new class of products, but it seems to be an Ive vanity project rather than something that really scratches an itch users didn't even know they had - which was the classic Jobs USP. (And - ironically - Watch isn't obviously a design classic.)

No big new ecosystems. No big new business relationships - unless you count far-from-leading options like ApplePay. No game changer products. Just endless glacial refinement.

Which is fine for generating mountains of cash, especially when you're selling into new markets like China.

But you can only keep producing reruns for so long before you lose momentum. And buyer good will isn't infinite.

Luckily for Apple, Windows is still an incredibly terrible POS as an OS, so the motivation to switch isn't there. Nor is MS ever likely to do better for cultural reasons.

And Linux is - Linux.

So Apple can continue on this path for five more years. Maybe ten. But when the next new thing appears - quantum, bio, plain old distributed, whatever the hell it is - Apple will be buried by it.


People have been saying this since the day after Steve died... Yet somehow Apple still thrives.


In the same fashion that Microsoft and IBM thrive: it's lost its dominance in the consumer market "headspace", printing money but no longer being a company worth knowing. It's just another BigCo at this point, with all the benefits and dullness associated with that. (E.g. it's difficult to be excited for Apple product announcements; they have a money-printing operation and their only intent is to maintain it as-is; naturally any updates will be rather mundane, safe iterations on the running system.)

In other words, apple became boring, or at least has been boring for a while now.

I mean, the whole computer market is boring (tablets, phones, laptops, etc.; all the same shit YoY), but it's kind of hard to think that current-Apple is going to break that trend, in the same way you wouldn't expect IBM or Xerox to. I don't know of any startups doing anything terribly interesting in the space with any real chance of success.

Hell, at this point Microsoft might be the likeliest candidate: at least they're doing something, with stuff like stupidly big conference-room screens, and maybe HoloLens will do something cool.


You concluded all this from horizontal scrolling?


The person I was replying to was not referring to horizontal scaling, and neither was I.

Just because a conversation starts with a topic doesn’t mean it stays there.


Is it your conclusion that they concluded all that from horizontal scrolling ... ?


I'm not saying that Apple won't thrive, nor anything like it. I'm merely saying the interface design would probably have made Steve fire the person who did it.

And I probably wouldn't argue.

After all, when viewing the page on an iPad, I can't double-tap the top of the screen to make it go back to the beginning. Irony at its best.

Seriously - that's a great feature that I use multiple times per day, and Apple broke their own rule!


I think Apple has become enough of a

crosses self, prays for forgiveness

/lifestyle brand/

...that it's going to take a massive misstep of some kind to shake people off the platform. A handful of UI misfeatures won't do it, even if it would do it to users of other software and platforms.


Wouldn't it be funny if sometime in a few years you'll need to take out a loan or lease Apple products instead of being able to buy them outright?


Cook is motivated by making his hill of beans as big as possible, rather than by being a super-cool dude who is the envy of other super-cool dudes. So I fully expect lease-with-no-chance-to-own to become the business model for some of the Apple range.

At least - Apple will try it. But I don't think it's going to fly.

Part of the Apple Magic[tm] is conspicuous display of ownership. Rent-to-wear won't have the same cachet.


It’s been a long time and he would have been angry about something by now. The question is, what?


*scrolling


And the scrolling changes midway ...


Ah well since the days of <forgot-the-name>.js slides framework I lost faith :D

We need fresnel refractive scrolling. Perfect for Retina displays, and to teach cosines to the laymen.


For the record, it seems to work both ways with desktop Safari.


And Chrome! (maybe they fixed it)


Also works correctly on iPhone safari


Agreed. It took me several seconds to figure out what to do, as the scroll-wheel on my mouse did nothing. I had to grab the horizontal scrollbar on the bottom of my browser and drag sideways.

I imagine it works well on a phone in portrait mode. On a desktop with a 16:9 monitor, a horizontally scrolling webpage is... bizarre.


It only lets me scroll sideways, so I cannot even use the wheel (even when it gets to the end where it scrolls vertically for a bit). I have to scroll lock the mouse and move the whole thing from side to side. That alone makes me close the window and not read the content.


This page works correctly only with JavaScript disabled. I wish more websites were like that.


That's weird. It works normally in the "Materialistic" HN app's built-in browser.


It's locked me up twice now.

Version 70.0.3538.77 (Official Build) (64-bit), Windows 10 Professional, Nvidia drivers.


On my phone it scrolls normally but the left and right of most images are cut off. Latest chrome.


Especially with a mouse wheel. It's just off in terms of coordination


Official Apple response: "You're holding it wrong."


It's so fucked up on my linux thinkpad that it's funny


It is super laggy to scroll as well. The scrolling is bad and the website is too heavy to scroll properly on my iPad Pro 9.7 with Safari. I guess they don't dogfood their stuff after the iOS 12 release.


I've heard that it's bad marketing to say things are bettER.

BrightER, biggER, fastER.

This is what I've seen Apple resort to over the last few years.

This is very concerning for a luxury product manufacturer. However, my reluctance to declare Apple 'MySpace' is that it takes a gigantic amount of resources and effort to leave the Apple ecosystem. A user likely won't gamble that a $400 Android is going to be better than an iPhone. They won't try something new. A switch from MySpace to Facebook was basically free.

I've gotten to try both, as I purchased one phone and got the other for work. HN downvotes anything anti-Apple, but I was surprised when things 'didn't just work': having to hit the play button a few times on my podcast screen, Siri's voice recognition being nothing like Google's, and a strange bug in Apple Maps causing my phone to reboot.

That said, I can see their death coming from a lack of reason to upgrade: users deciding their phone/tablet is good enough and they'd rather spend $1,000 on other things.

Since I havent experienced it, what is the appeal of Apple in 2018 and beyond?

EDIT: lol never change HN whales, never change


Given the strength of their product reviews and user satisfaction numbers it’s clear that your experience isn’t universal. That and their incredibly hard won brand loyalty and association with high value customers (when was the last time you saw an android phone being used for a product photo of a cross platform phone app?) means the wheels are unlikely to fall off anytime soon.


>product reviews and user satisfaction numbers

How would someone who hasn't switched know what to expect? It is very expensive and risky to try a new phone.

This was mentioned in the original post.


Happy iPhone user since 2018 and no regrets so far. I used Android for 7 years before that and I don't miss the constant crashes and the bad customer service I had with my HTC and Samsung phones (both flagship phones that cost the same as an iPhone).


Across the board, Samsung products are medium quality at best. Sorry you thought it was a flagship phone. I avoid that brand, from home appliances to TVs.

Never had issues with HTC, loved the HTC Incredible phone.


Is this running macOS, 'coz I'm seeing a dock!


iOS has had a dock since iOS 11.


iPhone OS has had a dock since version 1. They've just expanded what it can do. And they even used to push that iOS and OS X (at the time) shared the same base, but they've backed off that pretty hard despite it remaining true: Darwin underneath.


Ok, correction: a Dock with a large number of apps in it.


I don't want the thinnest iPad of all times, or "2x faster graphics", I want same frickin' device with at least twice as long battery life!


If there were demand for it, it would be easy to make a battery pack that mirrors the size and fits on the back, no?

It might just block the USB-C port, but I think that would be fine. If you need the port, you can run from the internal battery for a while, or get an adapter with power passthrough for a more permanent setup.


Why is this on HN front?


Because iPads are popular products made by a large technology company?


"iPad Pro faster than most PC laptops"

Once you add the keyboard and Pencil, AppleCare (without which you will pay out the rear to replace the battery, god forbid you drop it), a TB of storage, and cellular capability, you are looking at around $2,500.

The average PC laptop costs what, $700, and the laptop it would replace is probably 1-3 years old. Not exactly a compelling argument to spend thousands of dollars replacing a laptop with a tablet.


The device is very nice, but it does not provide enough of a value proposition for me to want to buy one.

It looks great for consuming content (though the focus on gaming when this device is supposed to be for professional use is odd), but the operating system itself limits the ability to produce anything more than trivial content or development.


> the operating system itself limits the ability to produce anything more than trivial content or development.

I think that touch UI has constraints that are tough to overcome, but I strongly disagree that the operating system is a limit.

Apple keeps exposing different ways to integrate apps, create workflows, even work with files directly.

And of course the problem with this debate always includes fundamental mismatches over what type of content you're interested in, or what you consider "trivial", but Photoshop isn't for trivial content.


> even work with files directly

Gosh! What an incredible new feature


I'm a hobbyist photographer and I edit all my photos in Lightroom on iPad. It's so much more fun and engaging to edit on iPad than on my iMac, not to mention easier (Apple Pencil makes touching up specific regions a breeze).


Apologies for not making a distinction. I was referring more to business documentation and software development. I failed to see the obvious uses in art and design.

I see that it would be useful in the field, but don't you find the lack of a second screen an issue? I always like to have my file browser and palettes away from my main work area.


To quote the princess bride: 'You keep using that word ['pro.']. I don't think you know what it means.'

Why on earth does Apple have the audacity to call it an 'iPad Pro', if I can't run any of Apple's actually 'pro' software on it?

Adobe beat Apple to actually getting the first piece of real software on the iPad, with Photoshop.

There are two applications in particular that, beyond Xcode and iOS development, keep me locked in the Apple ecosystem, and have for more than 15 years.

'Final Cut Pro' and 'Logic Pro'. The day we actually see Apple's own 'pro' software on an iPad is the day we can call it a 'pro' device. It's literally just a larger version of the iPad with a pencil.

Never mind the fact that it'd be great if we could actually develop on the thing... but that's a pipe dream. Expecting Apple to put its pro software on its pro device... well, I guess that would be almost as extreme a request as asking that their flagship phone and flagship computer be able to connect to each other out of the box.

I'm confused as to why we've got USB-C in this new iPad Pro and not in the iPhones they released not a month ago. Where is the consistency? Where is the logic? (no pun intended)

It was like in 2017, when they decided to label a laptop a 'pro' unit, soldered a non-upgradeable, paltry 16GB of RAM to the board, and literally threw a colourful toy in where our actual keyboard buttons used to be.

Or the fact that we've been promised a new Mac 'Pro' and are still stuck with the same poorly-received and certainly-not-pro garbage can we've had for years.

Even the new Mac Mini is more powerful than the 'Mac Pro'.

I'm a seriously 'pro' user of Macs. I have been for a decade and a half. And I'm tired of them using that word. Can they just stop?



