
[UPDATE] Wonderlust – Apple September event; new iPhones and watches

Lightwreather

So, here are all the announcements, and stuff that happened:

Apple Watch Series 9

-Largely the same design; the most significant changes are to the internals.

-Improved performance and, more crucially, longer battery life are promised by the new S9 chip. Apple claims 60% more transistors than the Series 8's chip and a GPU that's 30% faster. The most significant change may be the neural engine, which enables on-device processing of Siri requests and dictation that Apple says is 25% more accurate.

-Display now goes up to 2,000 nits, double the brightness of the Series 8, making it easier to use outdoors. It also goes down to a single nit in dark conditions.

-Addition of a second-generation Ultra Wideband chip to show you the distance and direction to your phone, rather than simply having your phone make a loud noise.

-Apple also promised a new gesture, "Double Tap", that it claims Watch users will be using "every day". It supposedly works via the Neural Engine's detection of "the unique signature of tiny wrist movements and changes in blood flow when the index finger and thumb perform a double tap." I'm a bit skeptical that this will actually catch on, though.

-FineWoven is a new strap material that Apple is rolling out across its whole product line as a replacement for leather, which is being phased out. FineWoven is a "microtwill made of 68 percent post-consumer recycled content that has significantly lower carbon emissions compared to leather," according to Apple. 82 percent of the yarn used to make the new Sport Loop is recycled.

- Supposedly, the company's first carbon-neutral device.

The Apple Watch SE remains available for $249, while the Series 9 starts at $399. They're both available for preorder today and should be released on September 22.


 

Apple Watch Ultra 2

-The Ultra 2 has the same new S9 chip as the Series 9 alongside the same "double tap" feature.

-There's a new display that hits 3,000 nits, even brighter than the Series 9's. (Sidenote: how bright would be too bright?)

-There's also a new "modular ultra" watchface that uses the edge of the display, “to present real-time data, including seconds, altitude, or depth”.

-There's also new support for Bluetooth cycling accessories and for ANT+.

-The battery is the same, hitting 36 hours on a single charge and 72 hours in low-power mode.

 

iPhone 15 and 15 Plus

Why, it's the star of the show, and literally what everyone was waiting for, sitting through agonising minutes after it was revealed to see if a highly anticipated feature would be released. It was.

 

-Apple ditched its proprietary Lightning port in favour of USB-C (albeit on the USB 2.0 standard; some claim that's because of SoC limitations) on this iteration of the iPhone, very likely because of the EU.

-The edge of the aluminium enclosure has a new contoured design that looks a bit different from the iPhone 14's and gives me a Pixel 4-esque vibe.

-Apple also claims the iPhone 15 devices are the first phones to have a "color-infused back glass." Apple's announcement said that it strengthened the phones' back glass with a "dual-ion exchange process" and then polished it with nano-crystalline particles and etched it for a "textured matte finish."

-The Dynamic Island, previously exclusive to the Pro models, comes to the base iPhones.

I take minor issue with Apple calling it an "all-new design" when it really isn't that different from the iPhone 14 and 14 Pro, but I suppose that's a me thing.

 

-Spec bump to the A16, which was in last year's Pro line-up.

-Main camera system was upgraded to a 48 MP sensor that takes 24 MP photos by default, using a computational photography process. This is the same system found on the iPhone 14 Pro. The larger sensor also enables a new "2x Telephoto" option, giving three "optical-quality" zoom levels (0.5x, 1x, and 2x).

-Additionally, for the iPhone 15's cameras, machine learning will automatically capture the depth data for portrait mode when appropriate, with richer color and better low-light performance. Night Mode ("sharper details and more vivid colors") and Smart HDR (brighter highlights, and improved mid-tones, shadows, and rendering of skin tones) are reportedly upgraded, too.

 

-The cameras also introduce focus and depth control, which lets you switch the focus of an image from one subject to another after the photo has been taken.

I will admit, that is pretty cool

 

-Also includes the second-gen Ultra Wideband chip (same as in the new Apple Watches). Apple says it enables connectivity with other devices from up to three times farther away.

-The iPhone 15 is supposed to have better audio quality during calls, thanks to a new machine-learning model that automatically prioritises your voice and can filter out more background noise when you select "voice isolation" mode during a call.

This is also pretty cool, and will absolutely be very helpful when you're trying to listen to someone talking in a very windy place. THANK YOU, APPLE.

-Apple is also adding Roadside Assistance via Satellite with the new devices. Users will be able to text roadside assistance and then select what they need assistance with, with options such as "flat tire" and "locked out" appearing via a menu that comes up in response. The feature will debut in the US with AAA.

-The iPhone 15 starts at $799 (128GB) and the iPhone 15 Plus at $899 (128GB); both are available on September 22, with preorders starting this Friday.

-Apple also launched new case options for the iPhone 15 series, including the new FineWoven material.

 

[Image: Apple (via ArsTechnica)]

 

iPhone 15 Pro and Pro Max

-Apple announced that the new iPhone 15 Pro line-up will switch from a stainless steel frame to one made out of brushed "grade 5 titanium," which it says makes the phones more durable and lighter.

I honestly expected them to bump the price just because of this

-The phones are also a little smaller than past models thanks to slimmer display bezels. The screen sizes stay the same: 6.1 inches for the base model and 6.7 inches for the larger iPhone 15 Pro Max. Apple didn't announce any changes to the phones' actual screens, so expect the same resolution, ProMotion refresh rate, and brightness as before.

-The iPhone's traditional mute switch was replaced by an "action button". By default, it still serves as a mute switch, but users can change it to launch apps, the camera, or custom Shortcuts workflows, which I see as a neat inclusion. There will be a different haptic response depending on whether the action mutes or unmutes.

-The Pro line-up will once again be a generation ahead of the non-Pros with the introduction of the A17 Pro, which is apparently Apple's first chip built on TSMC's 3nm node. It continues to use two large high-performance cores and four high-efficiency cores; Apple says the performance cores are 10 percent faster than before, a relatively mild improvement, while the efficiency cores are more efficient rather than faster.
   -The A17 Pro's six-core GPU is 20 percent faster than the A16's, and Apple has also added hardware-accelerated ray tracing. This is something I don't see as being particularly useful on a phone, but I did call out that this would be implemented by Genshin Impact and Honkai: Star Rail. Speaking of which, Resident Evil 4 will also be coming to the iPhone.

   -The chip also includes hardware acceleration for the AV1 video codec that is becoming more popular on streaming services.

-Apple has also added a new USB controller to power that USB-C port, allowing the iPhone 15 Pro (Max) to use USB 3 transfer speeds. 
"technically, this would make it either a USB 3.1 gen 2 or 3.2 gen 2 controller, if you can keep the USB-IF's naming straight" –ArsTechnica

-Apple says the Pro's 48-megapixel main sensor is larger than the one in the regular iPhone 15. Like the iPhone 15, by default, it will shrink the finished product down to 24 MP to save storage space, but if you're shooting in ProRAW mode you can get the full 48 MP image for cropping and editing. The camera defaults to a 24 mm focal length, but 28 mm and 35 mm options are also made possible by the large sensor, and you can set any of the three focal lengths as your default.
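If those extra focal lengths are simple crops of the 48 MP sensor (that's my assumption; Apple only says the large sensor makes them possible), the resolution maths works out roughly like this:

# Crop-factor arithmetic, assuming 28 mm and 35 mm are straight crops of the
# 48 MP, 24 mm-equivalent sensor (an assumption on my part, for illustration).
SENSOR_MP = 48
NATIVE_MM = 24

for focal_mm in (28, 35):
    crop = focal_mm / NATIVE_MM           # linear crop factor
    remaining_mp = SENSOR_MP / crop ** 2  # pixels left after cropping
    print(f"{focal_mm} mm: crop {crop:.2f}x, ~{remaining_mp:.0f} MP left on the sensor")

Even the 35 mm crop leaves roughly as many pixels as the 24 MP output, which is presumably why Apple is comfortable calling these "optical-quality".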

 

-Apple says that after an iOS update "later this year," the phone will be able to shoot spatial video that can be viewed in three dimensions in a Vision Pro headset.

-The iPhone 15 Pro starts at $999 (128GB), and the iPhone 15 Pro Max starts at $1,199 (256GB) available from the 22nd of September with preorders starting this Friday. FineWoven cases will also be available for the Pro line-up.

 

iOS and macOS

-iOS 17 hits supported devices on September 18

- macOS 14 Sonoma will be available on September 26, just over a week after iOS 17

 

Sources

Apple - [1] [2]

ArsTechnica – [1] [2] [3] [4] [5] [6] [7]

3 hours ago, RejZoR said:

Well, it is rocket science when you already have a designated chipset for the series, but you're essentially forced to change USB port (EU legislation).

The writing was on the wall for the EU legislation well before the A16 was designed. Plus, sooner or later they would have had to make the switch even without any legislation, and from a transfer-speed point of view it was long overdue.


1 hour ago, Kisai said:

My guess is that the actual I/O logic on the M1 and M2 is 2 TB ports (8 PCIe Lanes), with the Pro having 16 lanes, and the Max having 32. Maybe it has steerable lanes for all I know.  But the theme seems to be that the Mac Pro has 8 ports, the Mini can have up to 4, and the MacBook Air/Pro 2 or 3. Maybe there is something else in play but it does suggest that the M series does have all USB-C ports wired as TB4.

From memory, the Mac mini had different capabilities on its ports, since you could buy it with an M1/M2 or an M1 Pro/M2 Pro, and more of the ports supported TB when you selected the Pro-variant chip. But you are right: each higher M variant supports more and more TB ports and more total I/O bandwidth across all ports.

 

Still, the iPad Pro with the same M1 SoC had support for USB 4, while the M1 iPad Air was limited to USB 3.1. I think that has more to do with the USB PHY chip though, as you still need one of those to drive the actual USB port, and maybe it was either cheaper or physically smaller to go with a 3.1 PHY at the time 🤷‍♂️


1 hour ago, Dracarris said:

The writing was on the wall for the EU legislation well before the A16 was designed. Plus, sooner or later they would have had to make the switch even without any legislation, and from a transfer-speed point of view it was long overdue.

EU law does not even apply yet; it only applies to new products shipping after it becomes active, so this year's new phones are not even required to have USB-C. And the rule itself does not require any of the optional USB-C features (such as USB 3+); the only thing it requires is USB-PD, and even then only if the device supports higher charging speeds than generic USB 2.0 levels... Lightning has supported PD for years, so there is no issue there at all.

The law is just about the physical connector and nothing else. I expect you could even comply with it by having a port that doesn't have all the pins needed for the full spec, since all you need to support is 2.0-level charging, so you don't need all the pins.


2 hours ago, LAwLz said:

Personally, I am hoping that JPEG XL "wins" the war, because I feel like it's the best image format. 

 

Does JPEG XL support attachments, e.g. depth layers, little bits of video on the side, etc.? Looking through the documentation and frameworks people have published, I can't find any clear way one would add these.


7 hours ago, HenrySalayne said:

Obviously not, or it wouldn't be a problem? Maybe everybody is using their device wrong, IDK, but since I end up with .heic pictures from iPhone users a lot, it doesn't seem to be that simple.

They're probably doing something in a weird way or have some odd setting. I've been on iPhone for two years now and have never had a picture sent to an Android user in a way they can't view, using the default messaging app. In the text thread I'll either pick the picture from the gallery to send to them, or take a picture with the in-text camera mini-app.



8 hours ago, hishnash said:

Does JPEG XL support attachments, e.g. depth layers, little bits of video on the side, etc.? Looking through the documentation and frameworks people have published, I can't find any clear way one would add these.

Not 100% sure what you mean by "little bits of video on the side". Do you mean saving a small video clip inside the same container? If that's what you mean, then I think the answer is no. JPEG XL supports "video", but from what I know it's more like a series of still images than a "video file" (with inter-frame compression and all that). So for anything that involves video, one of the video-derived image formats like AVIF or HEIC will be superior.

Call me old-fashioned, but I kind of prefer keeping video formats and image formats separate, at least in a situation like this, where we have to decide whether we want two formats that are each really good at one specific thing, or one format that's really good at one thing and okay at another.

 

It does support depth layers, though. In fact, it supports 4099 channels, and a typical RGB image will only use 3, so you have 4096 channels remaining to fill with whatever data you want: an alpha channel for transparency, a depth channel for depth data, maybe a thermal channel if you've got a thermal camera. In that way, the format is very flexible.
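To make the channel idea concrete, here's a rough sketch of the kind of layout I mean. Note that jxl_encode() below is a hypothetical stand-in, not the real libjxl API:

import numpy as np

# One colour image plus extra channels, all the same width and height:
# alpha for transparency, depth for a depth map. This mirrors how JPEG XL's
# extra channels are described; the encoder call is a made-up placeholder.
h, w = 3024, 4032
rgb = np.zeros((h, w, 3), dtype=np.uint16)        # the 3 colour channels
alpha = np.full((h, w), 65535, dtype=np.uint16)   # fully opaque
depth = np.zeros((h, w), dtype=np.uint16)         # per-pixel depth data

def jxl_encode(color, extra_channels):
    """Hypothetical encoder stand-in; the real libjxl calls look different."""
    raise NotImplementedError

# jxl_encode(rgb, extra_channels={"alpha": alpha, "depth": depth})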


7 hours ago, LAwLz said:

Not 100% sure what you mean by "little bits of video on the side". Do you mean saving a small video clip inside the same container? If that's what you mean, then I think the answer is no.

I'd guess he's talking about Live Photos (I don't know what Android calls this) - basically you take an image, and it records a little extra on both sides of the shot, so you can adjust the frame being used (great for quickly correcting things like out of focus, or someone blinking without retaking the photo).

 

I don't know how iPhones do this (whether it's a series of still images, or an actual video) - but I do know that you can "export" the Live Photo as a video clip if you want. That part is much less useful (very niche), but it's neat.

7 hours ago, LAwLz said:

JPEG XL supports "video", but it's from what I know more like a series of still images rather than a "video file" (with inter-frame compression and all that).

I imagine that would still be sufficient for the purposes of a Live Photo - the primary idea behind them is to allow you to tweak which frame is used.


19 minutes ago, dalekphalm said:

I'd guess he's talking about Live Photos (I don't know what Android calls this) - basically you take an image, and it records a little extra on both sides of the shot, so you can adjust the frame being used (great for quickly correcting things like out of focus, or someone blinking without retaking the photo).

 

I don't know how iPhones do this (whether it's a series of still images, or an actual video) - but I do know that you can "export" the Live Photo as a video clip if you want. That part is much less useful (very niche), but it's neat.

I imagine that would still be sufficient for the purposes of a Live Photo - the primary idea behind them is to allow you to tweak which frame is used.

The primary purpose of live photos is to have a little video on either side, to capture the essence of the moment. Tweaking which frame you use results in a much lower quality photo, and should be avoided if at all possible.

 

Live photos make going through family photos WAY better/more fun. Losing them would absolutely not be worth moving off HEIC, were it required.


11 minutes ago, Obioban said:

The primary purpose of live photos is to have a little video on either side, to capture the essence of the moment. Tweaking which frame you use results in a much lower quality photo, and should be avoided if at all possible.

I don't agree with you here. In practice, very very few people are going to use Live Photos to "capture the essence of the moment". That might be what you specifically use it for, but Apple advertises it as a way to "pick the key photo and make edits", just like I stated.

11 minutes ago, Obioban said:

Live photos make going through family photos WAY better/more fun. Losing them would absolutely not be worth moving off HEIC, were it required.

I doubt most users would even notice or care, to be honest. It sounds like you use Live Photos in a particularly niche way.

 

With that in mind, I doubt that Apple will eliminate Live Photos anytime soon, unless they are forced to for some reason. HEIC compatibility is good enough at this point that there's little reason to eliminate it in the short term.


1 hour ago, dalekphalm said:

I don't agree with you here. In practice, very very few people are going to use Live Photos to "capture the essence of the moment". That might be what you specifically use it for, but Apple advertises it as a way to "pick the key photo and make edits", just like I stated.

I doubt most users would even notice or care, to be honest. It sounds like you use Live Photos in a particularly niche way.

I suspect people use it for just that all the time, because the iOS generated memories often use the video portion of them.

 

I feel like Apple's marketing of them is pretty in line with what I was saying:

https://developer.apple.com/design/human-interface-guidelines/live-photos

 

Quote

Live Photos lets people capture favorite memories in a sound- and motion-rich interactive experience that adds vitality to traditional still photos.


I've also never heard of Live Photos being for choosing the best frame; how would that even work, given that any video format will be in severely lower quality than a still image?

 

Also, I'm not sure the live video is actually stored inside the HEIC. Maybe it's simply linked in from an external file?


59 minutes ago, Dracarris said:

I've also never heard of Live Photos being for choosing the best frame; how would that even work, given that any video format will be in severely lower quality than a still image?

"Severely lower quality",  maybe - but the quality is still pretty high - I've personally never noticed a major difference when I've edited a live photo and chosen a different key frame. It's a choice you get to make as a user.

59 minutes ago, Dracarris said:

Also, I'm not sure the live video is actually stored inside the HEIC. Maybe it's simply linked in from an external file?

I couldn't tell you how it works from a technical perspective, but everything is in the one file, since if I share that file to another iPhone user, they can see the Live Photo in full. Same if I download a backed up version of the file.

 

According to this very old article from 2015, the iPhone 6s (where Live Photos were introduced) takes a series of still images, animates them together, and records audio separately, then stitches the whole thing together.

https://9to5mac.com/2015/09/11/how-live-photos-works/

 

If this is accurate, then there's literally no downside to this, as all the frames will be the same quality as the original key frame.

 

Given that's how it's being described, I see zero issues with Apple using the same method to create a Live Photo using JPEGXL or any other newer format.


17 minutes ago, dalekphalm said:

"Severely lower quality",  maybe - but the quality is still pretty high - I've personally never noticed a major difference when I've edited a live photo and chosen a different key frame. It's a choice you get to make as a user.

I couldn't tell you how it works from a technical perspective, but everything is in the one file, since if I share that file to another iPhone user, they can see the Live Photo in full. Same if I download a backed up version of the file.

 

According to this very old article from 2015, the iPhone 6s (where Live Photos were introduced) takes a series of still images, animates them together, and records audio separately, then stitches the whole thing together.

https://9to5mac.com/2015/09/11/how-live-photos-works/

 

If this is accurate, then there's literally no downside to this, as all the frames will be the same quality as the original key frame.

 

Given that's how it's being described, I see zero issues with Apple using the same method to create a Live Photo using JPEGXL or any other newer format.

That's not accurate/how it works. There is one picture file, and a much lower-resolution video.

 

If it was a series of full quality photos animated smoothly, you'd have MASSIVE files every time you took a picture.


11 hours ago, LAwLz said:

It does support depth layers though. In fact, it supports 4099 channels, and a typical RGB image will only use 3. So you have 4096 channels remaining to fill with whatever data you want. Alpha channel for transparency, depth channel for depth data, maybe a thermal channel if you got a thermal camera. In that way, the format is very flexible. 

So no video side channel (I use this to store other... non-video vector data). From looking at the APIs I can find, it looks like the other channels also need to be the same x,y size as each other; you can't give each its own size/offset/metadata. Having the flexibility is nice: as an app developer you can use it to store some of the original source data within the file, so if users re-open the image in an app that edited it, they can undo/alter old edits.


1 hour ago, dalekphalm said:

If this is accurate, then there's literally no downside to this, as all the frames will be the same quality as the original key frame.

 

Not all frames are the same quality at all; there are a few key frames that are high quality, and the rest are deltas applied to those using HEVC video encoding. If you were to just do 60 fps for 5 seconds at 24 MP (10-bit) in JPEG XL you would be looking at MASSIVE files, as the standard does not do any frame-to-frame compression like a video format does.
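Rough numbers behind that claim. The frame count and bit depth come straight from the figures above; the 20:1 per-frame compression ratio is my assumption:

# Back-of-the-envelope size of 5 s at 60 fps, 24 MP, 10-bit, stored as
# independently compressed stills (i.e. no inter-frame compression).
frames = 60 * 5                                  # 300 frames
raw_bits_per_frame = 24e6 * 3 * 10               # pixels * RGB channels * 10 bits
raw_gb = frames * raw_bits_per_frame / 8 / 1e9   # ~27 GB uncompressed
compressed_gb = raw_gb / 20                      # assume ~20:1 still-image compression

print(f"raw: ~{raw_gb:.0f} GB, as compressed stills: ~{compressed_gb:.1f} GB")

Compare that with the handful of megabytes an actual Live Photo takes.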


1 hour ago, Obioban said:

That's not accurate/how it works. There is one picture file, and a much lower-resolution video.

 

If it was a series of full quality photos animated smoothly, you'd have MASSIVE files every time you took a picture.

Feel free to provide counter evidence to how it works.

 

My point being that in practice, I've yet to notice any serious quality degradation when choosing a new key frame.


16 minutes ago, dalekphalm said:

Feel free to provide counter evidence to how it works.

 

My point being that in practice, I've yet to notice any serious quality degradation when choosing a new key frame.

In Photos for Mac, select a Live Photo. File, Export Unmodified Original. Choose a destination, click Export. What you get at that destination is a video file and an image file.

 

Original photo: [image]

Changed key frame: [image]


1 minute ago, Obioban said:

In Photos for Mac, select a Live Photo. File, Export Unmodified Original. Choose a destination, click Export. What you get at that destination is a video file and an image file.

Thank you for explaining how you got your result - that is useful information. Granted, I'm not a Mac user, so this is of limited use to me personally, but I will concede the point that it's an image file with a video file.

 

That doesn't change my stance mind you.


1 hour ago, hishnash said:

So no video side channel (I use this to store other... non-video vector data). From looking at the APIs I can find, it looks like the other channels also need to be the same x,y size as each other; you can't give each its own size/offset/metadata. Having the flexibility is nice: as an app developer you can use it to store some of the original source data within the file, so if users re-open the image in an app that edited it, they can undo/alter old edits.

If you just use the "video channel" to store other data then you should be able to do that in the other channels in the JPEG XL container as well.

You can also save multiple images with varying dimensions inside the same JPEG XL file if that's what you're after. You can read a bit more about it on page 6 in this whitepaper.

 

 

Apparently, I was wrong about the inter-frame compression. JPEG XL can do that by marking a frame as a "reference frame" and then using it when encoding subsequent frames, although I doubt it's as good as the video formats.
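Conceptually, that's just delta coding against a stored reference. A toy illustration of the idea (not the actual JPEG XL frame machinery, which is more sophisticated):

import numpy as np

# Toy reference-frame delta coding: store frame 0 in full, then store later
# frames as differences against it and rebuild them on decode.
frames = [np.random.randint(0, 256, (4, 4), dtype=np.int16) for _ in range(3)]

reference = frames[0]
deltas = [f - reference for f in frames[1:]]        # what would get "encoded"

rebuilt = [reference] + [reference + d for d in deltas]
assert all(np.array_equal(a, b) for a, b in zip(frames, rebuilt))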

 

 

Anyway, HEIC will have the edge when it comes to things involving video (since it is a video format after all), but for photos/pictures, JPEG XL is unmatched.

 

 

It's also worth noting that Google is currently heavily against JPEG XL, going as far as removing support for it from their software. Apple, on the other hand, is the company with the best support for JPEG XL, supporting it in pretty much all their software. I never thought I'd experience the day where I was saying Apple was at the forefront of format support and Google was the one acting like a child who wants everyone to like their toy and nothing else.


1 hour ago, dalekphalm said:

Feel free to provide counter evidence to how it works.

Evidence: a Live Photo is not 100 MB+, as it would have to be if there were several other full-quality frames to choose from. A single still image is 3-5 MB in HEIC.


4 minutes ago, Dracarris said:

Evidence: a Live Photo is not 100 MB+, as it would have to be if there were several other full-quality frames to choose from. A single still image is 3-5 MB in HEIC.

To be fair, that's an explanation, not evidence, but @Obioban already provided a good explanation with photo evidence.


2 hours ago, LAwLz said:

I never thought I'd experience the day where I was saying Apple was at the forefront of format support and Google was the one acting like a child who wants everyone to like their toy and nothing else.

I think this comes down to HW encoder/decoder support. There appears to be rather good JPEG XL support in modern Apple chips, and Apple has been pushing hard for an HDR image standard (which JPEG XL supports).

 

2 hours ago, LAwLz said:

You can also save multiple images with varying dimensions inside the same JPEG XL

I'm still looking for an I/O library that abstracts this to a level that does not require me to implement my own.

 


4 hours ago, LAwLz said:

It's also worth noting that Google is currently heavily against JPEG XL, going as far as removing support for it from their software. Apple, on the other hand, is the company with the best support for JPEG XL, supporting it in pretty much all their software. I never thought I'd experience the day where I was saying Apple was at the forefront of format support and Google was the one acting like a child who wants everyone to like their toy and nothing else.

 

It's amazing how Google's solutions are always garbage and the only reason they can push them at all is by taking away the choice to not use them.

 

JPEG XL should be the de-facto replacement for JPEG. They only gave it 2 years. Did you know it took more than 5 years for PNG to replace GIF? And we still do not have a lossless animated alpha-channel format?

 

Hell, we don't even have a lossy alpha-channel format. Not that there would ever be much desire for that in a still image, but there is for video, and the only format that supports an alpha channel is ProRes 4444 (ffmpeg prores_ks), which is like 100 MB/sec. Not megabits, megabytes. Another format that supports an "alpha channel" is SpeedHQ SHQ7/SHQ9 (NDI 4).

 

So what is one to do when they need to composite alpha-channels? Well, sucks to be you if you're an animator on Windows. 

 

It's kinda funny in the grand scheme of things, how all the photography and Film (live action) tools pay lip service to alpha channel needs. HEIC supports storing "metadata" for alpha, but that doesn't mean it will survive a pass through a tool that modifies the image.

 

H.265 "can" support alpha channel, in a hugely hacky way, which is why HEIC is still a hacky way to do it.

 

AVIF has the same problem as HEIC and WebP in this regard. These are not "image" formats; they are "video formats" that require video decoder paths to render a still image. WebP, HEIC, and AVIF should never have been "image formats", because what they really are is containers for video data handled at the discretion of the runtime, in which everything is optional. So one browser might render it in software and be slow, and another might use a hardware decoder but not support alpha.

 

So yes, JPEG XL makes sense as a still image format, because it loses the baggage of being a video format first. That makes it much more suitable for portable applications and still-image software: tools like standalone Photoshop don't need video decoder licenses, and likewise all the other photo and drawing tools.

 

But we mustn't forget the reason why PNG exists: there was political pushback against patented file formats, and there is no reason to adopt a format encumbered by patents if the existing stuff we have already works. PNG replaced GIF as a still image format, but we didn't get MNG, the accompanying animated format, in the browser; it was removed by Mozilla for "browser bloat" reasons, despite the fact that it was essentially just "animated PNG". So they developed APNG, which is just PNG animated like GIF (a sequence of frames), instead of the more complex MNG, which had features that could have made it a Flash replacement.

https://bugzilla.mozilla.org/show_bug.cgi?id=18574#c72

 

Mozilla killed MNG over it being "300KB"; meanwhile, both Firefox and Chrome ship with avcodec, which supports JPEG XL, a library that takes up several megabytes when compiled.

 

Killing MNG in Mozilla also killed support for JNG, which was a PNG-like container for JPEG data.

WebP is terrible and should never have come into existence. With the death of Flash, there is still a void left.

 

Just to point out the elephant in the room: Chrome's attempt to remove JPEG XL is like Mozilla's removal of MNG. Removing the support resulted in the tools that make the files ceasing development, because nobody could then view the files. Chrome removing JPEG XL is extremely flimsy, because it's part of avcodec; they would have to make an actual choice to remove it from their fork of avcodec.

https://chromium.googlesource.com/chromium/third_party/ffmpeg/+/refs/heads/master/configure

 

It's not like Chrome is independently linking all these libraries directly into Chrome any more than Firefox is.

 

But what drives support for an image format is people being able to share it, and if you nerf that in the browser, then people won't use it.

 


6 hours ago, hishnash said:

I think this comes down to HW encoder/decoder support. There appears to be rather good JPEG XL support in modern Apple chips, and Apple has been pushing hard for an HDR image standard (which JPEG XL supports).

I don't really buy that argument for why Apple supports it and Google doesn't, because as far as I know there is no hardware acceleration going on with JPEG XL (at least not at this time), and in terms of encoding and decoding, JPEG XL is way faster than AVIF which is what Google is pushing.

 

If it was a performance thing that kept Google from implementing it, they would definitely side with JPEG XL over AVIF.

 

 

6 hours ago, hishnash said:

I'm still looking for an I/O library that abstracts this to a level that does not require me to implement my own.

It's a very new format, and just like before we see that replacing an image format is a very difficult task.

If it takes off you will most likely see software implementations that do what you want them to do. But we will see if that happens.


4 hours ago, Kisai said:

JPEG XL should be the de-facto replacement for JPEG. They only gave it 2 years.

If you're talking about Google then it's even worse than that, because they didn't give it a fair chance during those two years.

They implemented it behind a flag but kept it turned off by default. Then they neglected it (it took them about two years to fix a bug that caused a 3x performance drop because it treated static images as animated), and then they discontinued it, citing "lack of usage". No shit people won't use it if you don't even enable it by default and instead make users go into a hidden menu to turn it on.

 

 

5 hours ago, Kisai said:

And we still do not have a lossless animated alphachannel format?

JPEG XL can do that, but if you want something today you can use APNG.
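If you want a quick way to try that today, Pillow can write APNG (lossless, alpha included) by saving a PNG with save_all. A minimal sketch, with placeholder frames and filename:

from PIL import Image  # Pillow >= 8.0 can write APNG via its PNG plugin

# Two RGBA frames (alpha channel included), saved losslessly as an animated PNG.
frames = [
    Image.new("RGBA", (64, 64), (255, 0, 0, 128)),
    Image.new("RGBA", (64, 64), (0, 0, 255, 128)),
]
frames[0].save(
    "animated.png",
    save_all=True,
    append_images=frames[1:],
    duration=500,  # milliseconds per frame
    loop=0,        # loop forever
)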

