I stopped releasing binaries for a number of my tools because I didn't want to pay the $100 a year for the right to do so, and I got tired of explaining how to run them without signing.
The same thing exists on Windows: developers have to code sign their binaries. It's even worse in my experience because you have to use a token (a USB key with cryptographic signing keys in it), and that's impractical if you want your CI/CD to run in a datacenter. At my company we had a Mac mini with a Windows VM and a code signing token plugged in, just for the purpose of signing our macOS and Windows binaries.
Another solution, not mentioned in the article, is that users of both macOS and Windows should be able to easily trust the certificate of a third-party software publisher, with a process integrated into their OS that explains the risks but can also be understood and trusted, so that publishers can self-sign their own binaries at no cost, without needing the approval of the OS vendor. Such a tool should ideally be integrated into the OS, but ultimately it could also be provided by a trusted third party.
I struggled with a similar problem recently. You can use osslsigncode to sign Windows binaries from Linux. It is also possible, with some pissing about, to get everything to work hands off.
In the end we went with Digicert Keylocker to handle the signing, using their CLI tool which we can run on Linux. For our product we generate binaries on the fly when requested and then sign them, and it's all done automatically.
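For anyone landing here with the same problem, the osslsigncode route mentioned above is short. A dry-run sketch that only prints the commands rather than executing them (the certificate path, password variable, and timestamp URL are placeholders, not from the thread):

```shell
#!/bin/sh
# Dry run: echo the signing pipeline instead of executing it,
# since it needs a real certificate and a real unsigned app.exe.
run() { echo "+ $*"; }

# Sign a Windows PE binary from Linux using a PKCS#12 bundle:
run osslsigncode sign \
    -pkcs12 codesign.pfx -pass "$PFX_PASSWORD" \
    -t http://timestamp.digicert.com \
    -in app.exe -out app-signed.exe

# Check the Authenticode signature afterwards:
run osslsigncode verify app-signed.exe
```

With a real certificate you'd drop the `run` wrapper; the token-based variants instead take a PKCS#11 engine, which is where the "pissing about" tends to live.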
Azure Key Vault - even in the ‘premium’ HSM flavour - can’t actually prove the HSM exists or is used, which doesn’t satisfy the requirements the CA has. In theory, it shouldn’t work - but some CAs choose to ignore both the letter and the spirit of the rules.
Even Azure’s $2,400-a-month managed HSM isn’t acceptable, as they don’t run them in FIPS mode.
Highly suggest trying Azure Trusted Signing on a CI system with Windows boxes (I use GitHub). Windows signing was an expensive nightmare before, but it's now relatively painless and down to $10/mo (which isn't cheap but is cheaper than the alternatives).
Azure Trusted Signing is a crapshoot. If you can get it running, it's easy and fast and great. But if you run into any problems at all during the setup process (and you very well might, since their onboarding process is held together with duct tape and twine), you're basically left for dead, and unless you're on an enterprise support plan you're not going to get any help from them at all.
Last time I checked it's still US/Canada only. Luckily I only needed code-signing for an internal app, so we just used our own PKI and pushed the certs over MDM.
Nope. Notarization is not code signing. It’s an extra step, after code signing, where you upload your software to Apple’s servers and wait for their system to approve it. It’s more onerous than code signing alone and, with hindsight, doesn’t seem to have been offering any extra protection.
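Concretely, the "extra step" looks like this on a Mac. A dry-run sketch that just prints each command (the identity, keychain profile name, and app paths are placeholders I made up):

```shell
#!/bin/sh
# Dry run: print the sign -> zip -> notarize -> staple pipeline
# rather than executing it (it needs a paid Developer ID).
run() { echo "+ $*"; }

# 1. Code sign with a hardened runtime (prerequisite for notarization):
run codesign --options runtime --timestamp \
    --sign "Developer ID Application: Example Corp (TEAM123456)" MyApp.app

# 2. Zip and upload to Apple's notary service, blocking until a verdict:
run ditto -c -k --keepParent MyApp.app MyApp.zip
run xcrun notarytool submit MyApp.zip --keychain-profile "notary" --wait

# 3. Staple the returned ticket so Gatekeeper can verify it offline:
run xcrun stapler staple MyApp.app
```

The `--wait` in step 2 is where the unpredictable delay lives: everything before and after it is local and fast.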
It's not the same, but in practice it's also not so different. Microsoft keeps track of how many times a certain executable has been run, and only after a certain threshold does the executable become openable without hunting for tiny buttons. The kicker: this also applies to signed binaries.
Microsoft will upload these executables to the cloud by default if you use their antivirus engine ("sample collection").
In a way, Microsoft is building the same "notarisation database", but it's doing so after executables have been released rather than before. Many vendors and developers will likely add their executables to that "database" simply by running them on a test system.
On the other hand, SmartScreen can be disabled pretty easily, whereas macOS doesn't offer a button to disable notarisation.
Microsoft's notarisation sounds fully automated and transparent, while Apple's is more political and hands-on. Individual apps getting their notarisation slowed to a glacial pace because the platform owner doesn't like them doesn't seem to happen in Microsoft land.
Wasn't there even a story some time ago about how some completely legit, legal, above-board app to virtualize old (pre OS X) versions of Mac OS got rejected by Apple's notarization process?
I'm honestly not even sure it's about denying competitors anything. It feels more like denying their users. Apple has a long history of intently denying users the ability to do what they want LONG before any potential App Store competitors appeared.
Notarization is the same for macOS and iOS AFAIK. Both platforms have a separate app store review process that's even more strict than the notarization process.
> Notarization is the same for macOS and iOS AFAIK.
Assuming the basic facts are straight, the linked story explicitly proves this is false:
> UTM says Apple refused to notarize the app because of the violation of rule 4.7, as that is included in Notarization Review Guidelines. However, the App Review Guidelines page disagrees. It does not annotate rule 4.7 as being part of the Notarization Review Guidelines. Indeed, if you select the “Show Notarization Review Guidelines Only” toggle, rule 4.7 is greyed out as not being applicable.
Rule 4.7 is App Review Guidelines for iOS, so this would be a case of failing notarization for iOS App Review Guidelines, which means the policies (and implementation) are different between platforms.
(Of course there's no such thing as "Notarization Review Guidelines" so maybe this whole story is suspect, but rule 4.7 is the App Review Guidelines rule that prohibits emulators.)
The point is that notarization plays the same role for both platforms: checks whose purpose is to make sure that the software won't harm the user's device, unrelated to the App Store review process. Both platforms have an additional App Store review process which is significantly more strict, and the notarization process isn't supposed to involve App Store review for either platform.
When Apple denies notarization for bullshit reasons on one platform, it makes me highly suspicious of their motivation for notarization on all platforms.
Their decision to use the same word for both is enough for me to treat them as the same. Apple has tried to convince people that notarization exists for the user's benefit; the iOS implementation of notarization has convinced me that that's not the case.
The bigger difference is that Apple isn't just checking for malware, it's checking for conformance with various APIs, manifest requirements and so on. Not as strict as the iOS App Store, maybe, but it will refuse to notarize if it detects use of unsanctioned API calls.
You don't even need signing for Microsoft's system to do what it does - it can operate on unsigned code, it's all hash based.
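The "it's all hash based" point is easy to see on any system: the identifier a reputation service needs is just a content hash of the binary, with no signature involved. A toy sketch (SHA-256 is my assumption here; Microsoft doesn't document its exact scheme):

```shell
#!/bin/sh
# Identify an "executable" purely by its content hash - the kind of key
# a reputation database can use for signed and unsigned binaries alike.
printf 'MZ fake-executable-bytes' > fake.exe

# Same bytes, same identity, regardless of filename or signature:
hash1=$(sha256sum fake.exe | cut -d' ' -f1)
cp fake.exe renamed.exe
hash2=$(sha256sum renamed.exe | cut -d' ' -f1)

echo "$hash1"
[ "$hash1" = "$hash2" ] && echo "identical content, identical identity"
```

Renaming, re-downloading, or stripping a signature doesn't change the hash, which is why the run counter survives all of that.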
Is there a concrete example of this? We know this isn't blanket policy, because of a recent story (https://news.ycombinator.com/item?id=45376977) that contradicts it. I can't find a reference to any macOS app failing notarization due to API calls.
Notarization doesn't blanket block all access to private APIs; but the notarization process may look for and block certain known accesses in certain cases. This is because notarization is not intended to be an Apple policy enforcement mechanism. It's intended to block malicious software.
So in other words, using private APIs in and of itself isn't an issue. Neither is it an issue if your application is one that serves up adult content, or is an alternate App Store, or anything else that Apple might reject from its own App Store for policy reasons. It's basically doing what you might expect a virus scanner to do.
Yeah, don't disagree with any of that, but I'm looking for explicit evidence that that is true (right now it sounds like it's just an assumption)? E.g., either examples of apps failing notarization due to API calls, or Apple explicitly saying that they analyze API calls. Without that it sounds like we're just guessing?
I have the opposite experience - on macOS you can guarantee what users will see when you distribute your notarized app, while on Windows you can't, for an indefinite period of time.
How often do you notarize your apps? Why does the speed matter at all? In my cases it takes 2 seconds for the notarization to complete.
The length of time notarization takes depends primarily on how large and complicated your app is, and how different it is from previous versions of the same application you've already notarized. The system seems to recognize large blocks of code that it's already analyzed and cleared and doesn't need to re-analyze. How much your binary churns between builds can greatly influence how fast your subsequent notarizations are.
A brand new developer account submitting a brand new application for notarization for the first time can expect the process might take a few days; and it's widely believed that first time notarizations require human confirmation because they do definitely take longer if submitted on a weekend or on a holiday. This is true even for extremely small, trivial applications. (Though I can tell you from personal experience that whatever human confirmation they're doing isn't very deep, because I've had first time notarizations on brand new developer accounts get approved even when notarizing a broken binary that doesn't actually launch.)
And of course sometimes their servers just go to shit and notarizations across the board all take significantly longer than normal, and it's not your fault at all. Apple's developer tooling support is kinda garbage.
“Notarize your macOS software to give users more confidence that the Developer ID-signed software you distribute has been checked by Apple for malicious components. Notarization of macOS software is not App Review. The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly.”
⇒ It seems notarization is static analysis, so they don’t need to launch the process.
Also, in some sense a program that doesn’t launch should pass notarization because, even though it may contain malware, that’s harmless because it won’t run.
It's more akin to an enforced malware scanner, at least in principle - a kind of mandatory VirusTotal with a stapled certificate.
In practice though they use it to turn the screws on various API compliance topics, and I'm not sure how effective it is realistically in terms of preventing malware exploits.
> doesn’t seem to have been offering any extra protection.
How would this be measured?
Since no one has pointed it out here, it seems obvious to me that the purpose of the notarization system is mainly to have the code signatures of software so that Apple can remotely disable any malware from running. (Kind of unsavory to some, but probably important in today's world, e.g., with Apple's reach with non-technical users especially?)
Not sure how anyone external to Apple would measure the effectiveness of the system (i.e., without knowing what has been disabled and why).
There's a lot of unsubstantiated rumors in this comment thread, e.g., that notarization on macOS has been deliberately used to block software that isn't malware on macOS. I haven't seen a concrete example of that though?
Disabling malware via hash or signature doesn't require the Notarization step at all. Server can tell clients to not run anything with hash xxyyzz and delete it. I mean, just think about it. If disabling stuff required the Notarization step beforehand, no anti-malware would have existed before Notarization. Nonsense.
I think notarization is just a more automated way to do this, e.g., otherwise Apple has to hunt down all the permutations of the binary themselves. It seems like it just simplifies the process? (It makes it a whitelist, not a blacklist, so it's certainly more aggressive.)
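The whitelist/blacklist distinction can be made concrete in a few lines of shell (hashes invented for illustration): a blocklist lets unknown binaries run, while an allowlist blocks them by default.

```shell
#!/bin/sh
# Toy sketch: blocklist vs allowlist decisions over invented hashes.
KNOWN_BAD="deadbeef"     # hashes flagged as malware
NOTARIZED="cafebabe"     # hashes scanned and approved

blocklist_allows() {     # classic anti-malware: run unless known bad
    case " $KNOWN_BAD " in *" $1 "*) return 1 ;; *) return 0 ;; esac
}
allowlist_allows() {     # notarization-style: run only if known good
    case " $NOTARIZED " in *" $1 "*) return 0 ;; *) return 1 ;; esac
}

unknown="12345678"       # a binary nobody has seen before
blocklist_allows "$unknown" && echo "blocklist: unknown binary runs"
allowlist_allows "$unknown" || echo "allowlist: unknown binary blocked"
```

That default for never-before-seen binaries is the whole "more aggressive" part: the blocklist fails open, the allowlist fails closed.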
> The same thing exists on Windows, developers have to code sign their binaries.
> Another solution that is not mentioned in the article is that users of both macos and windows
The article is actually about notarization on iOS, which is vastly different from notarization on macOS. On iOS, every app, whether in the App Store or outside the App Store, goes through manual Apple review. But apps distributed outside the App Store have fewer rules.
FTA: “Apple’s complete review of apps – known as “notarisation” process - a mandatory step for distributing any software on its platforms, represents the very gatekeeping behaviour the DMA was written to prevent.”
Notarization doesn’t involve a complete review (https://developer.apple.com/documentation/security/notarizin...): “Notarization of macOS software is not App Review. The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly.”
I also expect Apple will argue that requiring code to be notarized is explicitly allowed under the DMA, based on section 6.7:
“The gatekeeper shall not be prevented from taking strictly necessary and proportionate measures to ensure that interoperability does not compromise the integrity of the operating system, virtual assistant, hardware or software features provided by the gatekeeper, provided that such measures are duly justified by the gatekeeper.”
So, the discussion would have to be on whether this is strictly necessary and proportionate, and whether Apple duly justified that.
I think “strictly necessary” is a bit at odds with defense in depth (https://en.wikipedia.org/wiki/Defense_in_depth_(computing)), where you explicitly add redundancy to improve security, so we’ll see how a judge rules on that, but I can see them accepting it if Apple argues they’ll implement a similar feature on-device instead if they have to.
Suffered that back in the day with an Electron desktop app. Not to mention that the notarization and signing integration itself is completely broken. The first time you submit a binary it can take DAYS to process, and setting everything up to work properly with GitHub Actions CI/CD is absurdly time-consuming. It's ridiculous, and if you add this new notarial verification policy on top of that... In the end it's just Apple being Apple.
Can the FSFE also sue Google, to try to prevent them from forcing registration of all developers who want to install apps on any Android phone outside of the Play Store?
Again, I would happily donate to such an initiative before it is too late!
I suppose this kind of notarization across all digital platforms will take on even more importance once the EU CRA (Cyber Resilience Act) takes full effect at the end of 2027.
As an iOS user, I love this and you are free to hate me for it. It keeps my grandma safer from scams. This is why I bought her an iPhone.
I don't want to hear any of the usual "don't use sideloading if you don't like it". I don't want it to exist, so nobody can talk my grandma into installing a fake bank app over the phone, like they did to her once when she had an Android phone and stole all her money.
Yes, this still isn't foolproof - some scam apps might make it past notarization. Just like cover fees in clubs and gates in gated communities: it doesn't keep all the riff-raff away, but it helps.
This has to be a meme at this point; it's the Apple shill's version of "someone think of the children". Millions of people should be deprived of their consumer rights because of your fabricated hypotheticals regarding your grandma's smartphone use? I don't think so. Just buy her a "dumb phone"[1] instead and be 100% safe - should you actually be concerned, and if you aren't just concern trolling, that is. The regurgitation of this fear-mongering propaganda in defense of Apple's anti-competitive business practices is just unserious.
"Fabricated hypotheticals"??? How did you like living through the 1990s and early 2000's when Windows was an unfettered vector for viruses. Your position is elitist at best. Only the anointed few who know how to make keep their systems safe from exploits shall have access to computing. Ask your friends in who are not in the software business how they like checking the cryptographic signatures of the binaries that are about to install from the command line. What they don't know how? Well no compute for them.
Free/libre refers to user freedom. Mandatory licensing would restrict developer freedom in favor of user freedom, a common feature of consumer protection laws.
The submitted article is about iOS, not macOS. Apple unfortunately used the same word "notarization" on both platforms, but the processes are not even remotely similar. Perhaps the confusion was deliberate, but in any case, many commenters here are confused and mistakenly believe that iOS notarization is like macOS notarization.
iOS notarization is still manual review by Apple, but with fewer rules and restrictions.
> If you’ve opted into alternative distribution for customers in the European Union, you can choose to make your app version eligible for distribution on alternative app marketplaces or websites only by selecting to have it evaluated based on the Notarization Review Guidelines (a subset of the App Review Guidelines). Otherwise, App Review uses App Review Guidelines to evaluate your app version to make it eligible for distribution on the App Store, alternative app marketplaces, and websites if approved.
The DMA is about increasing competition between app stores. It is not about giving "freedom" to people. Notarization is an independent process from running an app store on Apple's platform.
Notarization doesn't involve any sort of editorial control. It's just a virus scanner that's run up front and then stapling an attestation to your application that it passed the scan. It does not involve looking at the content of your app and making any value judgements about it; it's purely an automated static analysis system checking your application for known malicious code.
UTM wasn't denied notarization because some virus scanner found that it was a virus, but because it violated App Store guidelines. That's editorial control.
You're talking about notarization on macOS. Notarization on iOS is vastly different. On iOS, notarization is more or less App Store review but with fewer rules.
Honestly, iOS notarization really muddied the waters. IMO, because Apple decided to name them the same and thus presumably considers them the same, we should be just as critical of and worried about notarization on the Mac as we are of notarization on iOS.
Everything you can get in an alternative app store has to be approved by Apple and they only approve stuff they'd allow in their store, making it not an alternative.
Software freedom, at least for end users, is a smokescreen too. I can turn your argument around: "you want more ransomware because of a few OSS enthusiasts?" What we need is a way to curb the excesses, such as high entrance barriers to the store.
A phone/tablet is a tool, with very intense usage, and huge privacy value, not an engineer's toy.
The real smokescreen is this freedom vs security false dichotomy. If you give up freedom for the promise of security, you get neither. Look at the App Store. It's full of harmful garbage designed to extract value and waste your time by any trick necessary. It's one step short of ransomware. Oh, unless you use an app for your important documents, then it comes under new management and demands you start paying monthly or lose your stuff. Suddenly that lack of freedom to continue using an old version of the app or to dig around its internals and pull out your data becomes a loss of security. It's fine though, because this type of ransomware is totally legal and inline with your benevolent platform dictator's policies.
Your argument falls apart when you consider the iPhone's 60% market share. People have spoken on whether they want dangerous, uncontrolled third-party apps on their phones.
I don't care what the riff-raff think; it is morally wrong and defies human freedom and dignity to require everyone to walk around with a locked-down surveillance device in their pocket in order to function in the economy.
60% of society could be raptured tomorrow and the world would be better off.
This is called the tyranny of the majority, where you're arguing that because most people don't care about freedom, therefore freedom doesn't have value. It's not a sound argument, much like saying freedom of speech doesn't matter because most people have nothing to say.
Editing to add: it seems particularly ironic that you think iPhone users make great purchasing decisions when they buy the phone, but are incapable of making good decisions when selecting software. What accounts for the discrepancy?
Most people are stupid and short-sighted. Pointing to the stupid in support of your argument doesn't help it.
And, the app store does absolutely nothing to prevent "dangerous" apps. Apple doesn't review the code. In fact, if your code is reviewable, it's even harder to get it on the app store.
At the end of the day, the App Store and Play Store are filled with adware, spyware, and other malware - because Apple and Google like it that way. That's what they want. They don't give a single flying fuck about your security. They care about extracting 30% while simultaneously doing as little as possible. That's completely at odds with security, yes, and they know that. They just don't care.
Just in case you unironically don't understand this and aren't just playing it up:
Allowing third-party installations does not mean uncontrolled third-party apps. It merely means users have the option to install software on their phones - which continues to limit the software's capabilities until the user is prompted to allow each one.
You could argue "but a braindead person can randomly go on a phishing website, randomly download some .app file and suddenly - as if by magic - go through a theoretical installation dialog to finally explicitly grant this malware problematic permissions"... And I'm sure there are going to be people who will do exactly that. But without it, they'd still manage to do the same thing to the same effect, just without the app installation - by entering their bank credentials into a phishing site or similar.
The thing you're citing as a problem solved by disallowing app installs isn't actually solved - and it would not become more of a problem either.
Finally, the fact of the matter remains that almost nobody would actually use the capability to install from third party stores, as you've correctly insinuated. But if anything, that should be another proof that allowing third party installs doesn't reduce security.
People just like to have everything provided to them from a single source, and will usually pay a premium for that.
What point are you even trying to make? That's not a counter-argument unless you assume that people in aggregate always make great purchasing decisions. Wait until you hear about cigarettes, heroin, slot machines, snake oil, tulips, and the rest of the effectively infinite list of fun and unique ways people make terrible choices or are bamboozled into acting against their own and others' interests. This is a comment thread about protecting people from scams. The premise acknowledges that people make widespread poor decisions. Is it so unthinkable that buying an iPhone is one of them?
They are using it as a proxy for "people with low technical skills" (which is a specious argument since it was a friend of my parents who got me into programming and he remains one of the best I've ever known) and making the usual argument that we should limit control of our devices to make it safe for them.
I actually don't have (much) of an issue with walled garden approaches as long as the wall has a gate that is easily opened, give me an OS level toggle with a warning of "Here be dragons" and I can live with it - it's not ideal but it's not a terrible trade off.
It's something Android has had previously (but they seem to be trying to lock that gate) and iOS less so.
How about instead of a single os level toggle you get a trillion dollar company, renowned for their high quality design, invested in providing the best possible UX while respecting the user as the owner of the device?
Which is something I find very annoying, because I know a lot of parents and grandparents who have greater technical skills than their children or grandchildren.
They don’t. You can still run any software you’d like. You just get warnings, so people like parents don’t just randomly open malicious programs from the internet.
App developers do know. I can't say that I've ever worked on an app where this request has been made. Neither the App Store Connect Agreement[0] nor the Apple Developer Agreement[1] stipulates that the developer can be compelled to surrender their source code.
All the relevant agreements can be found here, so if there's something that specifies this kind of overreach, I'd both be very surprised and interested.
“If you are required by law, regulation, or court order to disclose any Apple Confidential Information (which can include requests related to legal investigations or audits), you agree to give Apple prompt notice and to cooperate in seeking a protective order or confidential treatment of such information”
Right, if we could educate users on the tools they use, and if the trillion dollar companies could provide tools to help community members protect each other, we wouldn't be here. Apple doesn't have to be a dictator if they would help the community support each other. Instead they took the easy way out of stripping freedoms from everyone so they can control every device out there. It's a minor inconvenience to be involved in protecting vulnerable people in our community, it's tragic that people just said Apple should take that role.
> I still don’t see why you would want your parents to run untrusted software on their devices, but you do you I guess.
I don't trust Apple's App Store review. They've approved countless scams that have tricked Apple users out of a lot of money, perhaps $billions in total.
Sadly, about 98% of real-world users are going to fall for scams, ransomware and the like. They are not mentally challenged; there are just so many traps/fakes/tempting things that we as IT people are more aware of (but even we still fall for some).
We also can't count on every person being able to check every single thing they do: how do you check whether some food or drug you get is good or not? You can't really; you have to trust someone who knows.
It’s a bit like the Elizabeth Warren toaster analogy. If you bought a toaster with shoddy wiring and it caught fire and burned down your house, everyone would blame the manufacturer and not sneer at you online for not learning electrical engineering and not checking the wiring yourself before using it.
It's more like if I buy a reliable toaster, but I buy bread that's secretly poisoned by the manufacturer and hurt myself. I'm not gonna demand the toaster maker add a poison sensor to the toaster and say "how dare they didn't protect me!"
I don't buy this in the first place. It is reasonable to expect consumers to do some background research into the products they buy. In fact, it is the only way capitalism can function as a meritocracy.
Society should be more dangerous as a means to force people to learn more about technology they rely on.
How can we trust software anymore? Open source projects are being sold to bad actors. Python's default repos are full of malware. Originally blessed and trusted apps are being bought by software companies in dodgy countries. It seems like we can only trust big software companies like Microsoft and Oracle.
I'm building an application that allows you to send a file to your colleagues. That's hardly a revolutionary or unusual use case, and it definitely requires network access and full access to the local file system. I also need the ability to lock files, writing file locks anywhere on the system, and I need to be able to index the contents of files.
Not only are all of these functions and corresponding permissions completely standard for all kinds of applications, they belong to the core of what any system that calls itself an "operating system" should deliver to developers and end users.
You can see it in action. I have an M1 Ultra Mac Studio, an insanely powerful machine, and when building open source software, the actual compilation flies but the autoconf step crawls, because it has to build test binaries to test for OS features and notarization slows that down dramatically.
Notarization is completely optional when building any OSS software on a Mac, and not part of any default build process I know of. A Mac can sign builds for running locally - a process which is fast, completely local, and doesn't require building test binaries or anything like that. Even a Mac building for an iPhone in developer mode has a local cert it can use, and doesn't require notarization.
Notarization is only needed when distributing binaries to others. Personally I do it once a month for the Mac app I distribute.
I stopped releasing binaries for a number of my tools because I didn't want to pay the $100 a year for the right to do so, and I got tired of explaining how to run them without signing.
The post I wrote to point people at anyway:
https://donatstudios.com/mac-terminal-run-unsigned-binaries
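For reference, the usual escape hatches for a downloaded unsigned binary boil down to something like the following. A dry-run sketch that only prints the commands (`./mytool` is a placeholder; this is my summary, not the linked post's):

```shell
#!/bin/sh
# Dry run: print the usual workarounds instead of executing them.
run() { echo "+ $*"; }

# Clear the quarantine attribute Gatekeeper checks on first launch:
run xattr -d com.apple.quarantine ./mytool

# Or ad-hoc sign the binary locally (no Developer ID or fee needed):
run codesign --force --sign - ./mytool
```

Either one is easy for a developer and baffling for everyone else, which is the support burden being described above.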
Note that the submitted article is about iOS, not macOS. The "notarization" process on iOS shares practically nothing with macOS except the name:
https://developer.apple.com/help/app-store-connect/managing-...
iOS notarization is just app review with fewer rules.
Doesn't it also require the same $100 annual fee?
Yes, but again, this has nothing to do with the submitted article.
The same thing exists on Windows: developers have to code sign their binaries. It's even worse in my experience because you have to use a token (a USB key with cryptographic signing keys in it), and that's impractical if you want your CI/CD to run in a datacenter. At my company we had a Mac mini with a Windows VM and a code signing token plugged in, just for the purpose of signing our macOS and Windows binaries.
Another solution, not mentioned in the article, is that users of both macOS and Windows should be able to easily trust the certificate of a third-party software publisher, with a process integrated into their OS that explains the risks but can also be understood and trusted, so that publishers can self-sign their own binaries at no cost, without needing the approval of the OS vendor. Such a tool should ideally be integrated into the OS, but ultimately it could also be provided by a trusted third party.
If I try and run an unsigned program, the UAC window will be yellow, but I can run it with zero issue.
I cannot do the same thing on macOS with the same ease, and that's the issue.
Just FYI, you don't have to use a USB stick; you can also use an HSM like Azure Key Vault and sign using AzureSignTool.
Azure Key Vault - even in the ‘premium’ HSM flavour can’t actually prove the HSM exists or is used, which doesn’t satisfy the requirements the CA has. In theory, it shouldn’t work - but some CAs choose to ignore the letter and the spirit of the rules. Even Azure’s $2400a month managed HSM isn’t acceptable, as they don’t run them in FIPS mode.
Highly suggest trying Azure Trusted Signing on a CI system with windows boxes (I use Github). Windows signing was an expensive nightmare before, but is now relatively painless and down to $10/mo (which isn't cheap but is cheaper than the alternatives).
Azure Trusted Signing is a crapshoot. If you can get it running, it's easy and fast and great. But if you run into any problems at all during the setup process (and you very well might, since their onboarding process is held together with duct tape and twine), you're basically left for dead, and unless you're on an enterprise support plan you're not going to get any help from them at all.
Last time I checked it's still US/Canada only. Luckily I only needed code-signing for an internal app, so we just used our own PKI and pushed the certs over MDM.
Nope. Notarization is not code signing. It’s an extra step, after code signing, where you upload your software to Apple’s servers and wait for their system to approve it. It’s more onerous than code signing alone and, with hindsight, doesn’t seem to have been offering any extra protection.
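For readers who haven't done it, the extra step looks roughly like this with Apple's current tooling; the signing identity, keychain profile name, and file names below are all placeholders:

```sh
# 1. Code sign with a hardened runtime and a Developer ID certificate
#    (the identity string "Example Corp (TEAMID1234)" is a placeholder)
codesign --force --options runtime --timestamp \
  --sign "Developer ID Application: Example Corp (TEAMID1234)" MyApp.app

# 2. Zip and upload to Apple's notary service; --wait blocks until the
#    automated scan returns a verdict
ditto -c -k --keepParent MyApp.app MyApp.zip
xcrun notarytool submit MyApp.zip --keychain-profile "notary-profile" --wait

# 3. Staple the ticket to the app so Gatekeeper can verify it offline
xcrun stapler staple MyApp.app
```

Step 2 is the part that's genuinely new relative to plain code signing: your release pipeline now has a synchronous dependency on Apple's servers.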
It's not the same, but in practice it's also not so different. Microsoft keeps track of how many times a certain executable has been run, and only after a certain threshold does the executable become openable without hunting for tiny buttons. The kicker: this also applies to signed binaries.
Microsoft will upload these executables to the cloud by default if you use their antivirus engine ("sample collection").
In a way, Microsoft is building the same "notarisation database", but it's doing so after executables have been released rather than before. Many vendors and developers will likely add their executables to that "database" by simply running them on a test system.
On the other hand, SmartScreen can be disabled pretty easily, whereas macOS doesn't offer a button to disable notarisation.
Microsoft's notarisation sounds fully automated and transparent, while Apple's is more political and hands-on. Individual apps getting their notarisation slowed to a glacial pace because the platform owner doesn't like them doesn't seem to happen in Microsoft land.
Wasn't there even a story some time ago about how some completely legit, legal, above-board app to virtualize old (pre OS X) versions of Mac OS got rejected by Apple's notarization process?
Yes. Probably this story?
https://9to5mac.com/2024/06/19/iphone-pc-emulator-block-ille...
“UTM SE” is now on the App Store. Perhaps this was just a mistake?
https://apps.apple.com/us/app/utm-se-retro-pc-emulator/id156...
It was the standard business pattern of denying your competitors everything you can, unless it causes a third-party fuss.
I'm honestly not even sure it's about denying competitors anything. It feels more like denying their users. Apple has a long history of intently denying users the ability to do what they want LONG before any potential App Store competitors appeared.
Note this is an iPhone app (noting because this thread seems to mainly be about macOS).
Notarization is the same for macOS and iOS AFAIK. Both platforms have a separate app store review process that's even more strict than the notarization process.
> Notarization is the same for macOS and iOS AFAIK.
Assuming the basic facts are straight, the linked story explicitly proves this is false:
> UTM says Apple refused to notarize the app because of the violation of rule 4.7, as that is included in Notarization Review Guidelines. However, the App Review Guidelines page disagrees. It does not annotate rule 4.7 as being part of the Notarization Review Guidelines. Indeed, if you select the “Show Notarization Review Guidelines Only” toggle, rule 4.7 is greyed out as not being applicable.
Rule 4.7 is App Review Guidelines for iOS, so this would be a case of failing notarization for iOS App Review Guidelines, which means the policies (and implementation) are different between platforms.
(Of course there's no such thing as "Notarization Review Guidelines" so maybe this whole story is suspect, but rule 4.7 is the App Review Guidelines rule that prohibits emulators.)
The point is that notarization plays the same role for both platforms: checks whose purpose is to make sure that the software won't harm the user's device, unrelated to the App Store review process. Both platforms have an additional App Store review process which is significantly more strict, and the notarization process isn't supposed to involve App Store review for either platform.
When Apple denies notarization for bullshit reasons on one platform, it makes me highly suspicious of their motivation for notarization on all platforms.
> Notarization is the same for macOS and iOS AFAIK.
It's not. They're totally different. The only thing they share is the word "notarization".
Their decision to use the same word for both is enough for me to treat them as the same. Apple has tried to convince people that notarization exists for the user's benefit; the iOS implementation of notarization has convinced me that that's not the case.
> Their decision to use the same word for both is enough for me to treat them as the same.
Ok... you can believe whatever you want to believe based on one word, or you can read the documentation that Apple has published:
https://developer.apple.com/help/app-store-connect/managing-...
The bigger difference is that Apple isn't just checking for malware, it's checking for conformance with various APIs, manifest requirements and so on. Not as strict as the iOS App Store, maybe, but it will refuse to notarize if it detects use of unsanctioned API calls.
You don't even need signing for Microsoft's system to do what it does - it can operate on unsigned code, it's all hash based.
> detects use of unsanctioned API calls
Is there a concrete example of this? We know this isn't blanket policy, because of a recent story (https://news.ycombinator.com/item?id=45376977) that contradicts it. I can't find a reference to any macOS app failing notarization due to API calls.
Notarization doesn't blanket block all access to private APIs; but the notarization process may look for and block certain known accesses in certain cases. This is because notarization is not intended to be an Apple policy enforcement mechanism. It's intended to block malicious software.
So in other words, using private APIs in and of itself isn't an issue. Neither is it an issue if your application is one that serves up adult content, or is an alternate App Store, or anything else that Apple might reject from its own App Store for policy reasons. It's basically doing what you might expect a virus scanner to do.
Yeah, don't disagree with any of that, but I'm looking for explicit evidence that that is true (right now it sounds like it's just an assumption)? E.g., either examples of apps failing notarization due to API calls, or Apple explicitly saying that they analyze API calls. Without that it sounds like we're just guessing?
> it will refuse to notarize if it detects use of unsanctioned API calls.
Or really any reason. They're not supposed to exert editorial control but that's how it has been happening in practice.
I have the opposite experience - on macOS you can guarantee what users will see when you distribute your notarized app, while on Windows you cannot for undefined time.
How often do you notarize your apps? Why does the speed matter at all? In my cases it takes 2 seconds for the notarization to complete.
The article is about iOS, and getting your notarization in 2 seconds versus weeks is IMHO a big difference.
There are obviously simple cases where iOS notarization also flies through in 2 secs, but there seem to be enough tougher cases:
https://www.reddit.com/r/iOSProgramming/comments/1l9m7jd/how...
The length of time notarization takes depends primarily upon how large and complicated your app is, and how different is from previous versions of the same application you've previously notarized. The system seems to recognize large blocks of code that it's already analyzed and cleared and doesn't need to re-analyze. How much your binary churns between builds can greatly influence how fast your subsequent notarizations are.
A brand new developer account submitting a brand new application for notarization for the first time can expect the process might take a few days; and it's widely believed that first time notarizations require human confirmation because they do definitely take longer if submitted on a weekend or on a holiday. This is true even for extremely small, trivial applications. (Though I can tell you from personal experience that whatever human confirmation they're doing isn't very deep, because I've had first time notarizations on brand new developer accounts get approved even when notarizing a broken binary that doesn't actually launch.)
And of course sometimes their servers just go to shit and notarizations across the board all take significantly longer than normal, and it's not your fault at all. Apple's developer tooling support is kinda garbage.
> I've had first time notarizations on brand new developer accounts get approved even when notarizing a broken binary that doesn't actually launch
https://developer.apple.com/documentation/security/notarizin... (emphasis added):
“Notarize your macOS software to give users more confidence that the Developer ID-signed software you distribute has been checked by Apple for malicious components. _Notarization_of_macOS_software_is_not_App_Review. The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly.”
⇒ It seems notarization is static analysis, so they don’t need to launch the process.
Also, in some sense a program that doesn’t launch should pass notarization because, even though it may contain malware, that’s harmless because it won’t run.
I went through the comment there, all of those look like the most likely explanation is just bugs in the notarization system.
>It’s more onerous than code signing alone and ...
I don't know, I sometimes contemplated sticking sharpened pencils in my eyes for light relief whilst trying to renew my code signing certificates.
It's more akin to an enforced malware scanner, at least in principle, kind of mandatory VirusTotal with a stapled certificate.
In practice though they use it to turn the screws on various API compliance topics, and I'm not sure how effective it is realistically in terms of preventing malware exploits.
> In practice though they use it to turn the screws on various API compliance topics
Do you have an example of this on macOS?
> doesn’t seem to have been offering any extra protection.
How would this be measured?
Since no one has pointed it out here, it seems obvious to me that the purpose of the notarization system is mainly to have the code signatures of software so that Apple can remotely disable any malware from running. (Kind of unsavory to some, but probably important in today's world, e.g., with Apple's reach with non-technical users especially?)
Not sure how anyone external to Apple would measure the effectiveness of the system (i.e., without knowing what has been disabled and why).
There's a lot of unsubstantiated rumors in this comment thread, e.g., that notarization on macOS has been deliberately used to block software that isn't malware on macOS. I haven't seen a concrete example of that though?
Disabling malware via hash or signature doesn't require the notarization step at all. The server can tell clients not to run anything with hash xxyyzz and to delete it. I mean, just think about it: if disabling stuff required the notarization step beforehand, no anti-malware would have existed before notarization. Nonsense.
I think notarization is just a more automated way to do this approach, e.g., otherwise Apple has to hunt down all the permutations of the binary themselves. It seems like it just simplifies the process? (It makes it a white list not a black list, so it's certainly more aggressive.)
> The same thing exists on Windows, developers have to code sign their binaries.
> Another solution that is not mentioned in the article is that users of both macos and windows
The article is actually about notarization on iOS, which is vastly different from notarization on macOS. On iOS, every app, whether in the App Store or outside the App Store, goes through manual Apple review. But apps distributed outside the App Store have fewer rules.
You can virtualize an HSM FWIW.
I stopped prostituting myself for Apple a long time ago.
Glad more developers are seeing the light now.
Same thing, I jumped off iOS/macos development long time ago, it was probably the best career decision.
FTA: “Apple’s complete review of apps – known as “notarisation” process - a mandatory step for distributing any software on its platforms, represents the very gatekeeping behaviour the DMA was written to prevent.”
Notarization doesn’t involve a complete review (https://developer.apple.com/documentation/security/notarizin...: “Notarization of macOS software is not App Review. The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly.”)
I also expect Apple will argue that requiring code to be notarized is explicitly allowed under the DMA, based on section 6.7:
“The gatekeeper shall not be prevented from taking strictly necessary and proportionate measures to ensure that interoperability does not compromise the integrity of the operating system, virtual assistant, hardware or software features provided by the gatekeeper, provided that such measures are duly justified by the gatekeeper.”
So, the discussion would have to be on whether this is strictly necessary and proportionate, and whether Apple duly justified that.
I think “strictly necessary” is a bit at odds with defense in depth (https://en.wikipedia.org/wiki/Defense_in_depth_(computing)), where you explicitly add redundancy to improve security, so we’ll see how a judge rules that, but I can see them accepting it if Apple argues they’ll implement a similar feature on-device instead if they have to.
> “Notarization of macOS software
The submitted article is about notarization on iOS, which is vastly different from notarization on macOS.
It's a shame that Apple used the same word for both platforms, because it appears to be confusing everyone. Maybe that was deliberate...
Suffered that back in the day with an Electron desktop app. Not to mention that the notarization and signing integration itself is completely broken. The first time you submit a binary it can take DAYS to process, and setting everything up to work properly with GitHub Actions CI/CD is absurdly time-consuming. It's ridiculous, and if you add this new notarial verification policy on top of that... In the end it's just Apple being Apple.
Can the FSFE also sue Google, to try to stop them from forcing registration of all developers who want to install apps on any Android phone outside of the Play Store?
Again, I would happily donate to such an initiative before it is too late!
I suppose this kind of notarization across all digital platforms will have even more importance once the EU CRA (Cyber Resilience Act) takes full effect at the end of 2027.
As an iOS user, I love this and you are free to hate me for it. It keeps my grandma safer from scams. This is why I bought her an iPhone.
I don't want to hear any of the usual "don't use sideloading if you don't like it". I don't want it to exist so nobody can talk my grandma into installing a fake bank app over the phone, like they did to her once when she had an android phone and stole all her money.
Yes this is not foolproof still, some scam apps might make it past notarization. Just like cover fees in clubs and gates in gated communities -- it does not keep all the riff-raff away, but it helps.
This has to be a meme at this point; it's the Apple shill's version of "somebody think of the children". Millions of people should be deprived of their consumer rights because of your fabricated hypotheticals regarding your grandma's smartphone use? I don't think so. Just buy her a "dumb phone"[1] instead and be 100% safe, if you're actually concerned and not just concern trolling, that is. The regurgitation of this fear-mongering propaganda in defense of Apple's anti-competitive business practices is just unserious.
[1] https://www.dumbphones.org/
"Fabricated hypotheticals"??? How did you like living through the 1990s and early 2000s, when Windows was an unfettered vector for viruses? Your position is elitist at best: only the anointed few who know how to keep their systems safe from exploits shall have access to computing. Ask your friends who are not in the software business how they like checking the cryptographic signatures of the binaries they're about to install from the command line. What, they don't know how? Well, no compute for them.
Mandatory FLOSS and open hardware are SERIOUSLY the sole way we can evolve positively.
Mandatory != free/libre
Free/libre refers to user freedom. Mandatory licensing would restrict developer freedom in favor of user freedom, a common feature of consumer protection laws.
Freedom always comes at a cost.
Absolute freedom does not exist.
Etc.
The submitted article is about iOS, not macOS. Apple unfortunately used the same word "notarization" on both platforms, but the processes are not even remotely similar. Perhaps the confusion was deliberate, but in any case, many commenters here are confused and mistakenly believe that iOS notarization is like macOS notarization.
iOS notarization is still manual review by Apple, but with fewer rules and restrictions.
https://developer.apple.com/help/app-store-connect/managing-...
> If you’ve opted into alternative distribution for customers in the European Union, you can choose to make your app version eligible for distribution on alternative app marketplaces or websites only by selecting to have it evaluated based on the Notarization Review Guidelines (a subset of the App Review Guidelines). Otherwise, App Review uses App Review Guidelines to evaluate your app version to make it eligible for distribution on the App Store, alternative app marketplaces, and websites if approved.
In the end, it's the same for Windows too since you need to pay for a cert.
Where are all these "Apple can do whatever they want on their platform" bootlickers?
The DMA is about increasing competition among app stores. It is not about giving "freedom" to people. Notarization is an independent process from running an app store on Apple's platform.
What if I want an app store that doesn't require notarization?
Well, it gives Apple editorial control over non-Apple app stores.
Notarization doesn't involve any sort of editorial control. It's just a virus scanner that's run up front and then stapling an attestation to your application that it passed the scan. It does not involve looking at the content of your app and making any value judgements about it; it's purely an automated static analysis system checking your application for known malicious code.
This is just factually incorrect. See: https://9to5mac.com/2024/06/09/apple-blocks-pc-emulator-utm-...
UTM wasn't denied notarization because some virus scanner found that it was a virus, but because it violated App Store guidelines. That's editorial control.
You're talking about notarization on macOS. Notarization on iOS is vastly different. On iOS, notarization is more or less App Store review but with fewer rules.
Honestly, iOS notarization really muddied the waters. IMO, because Apple decided to name them the same and thus presumably considers them the same, we should be just as critical of and worried about notarization on the Mac as we are of notarization on iOS.
Everything you can get in an alternative app store has to be approved by Apple and they only approve stuff they'd allow in their store, making it not an alternative.
I still don’t see why you would want your parents to run untrusted software on their devices, but you do you I guess.
This argument is in the same vein as “chat control because of child safety”.
Its a smokescreen.
You want less liberty because of the “least competent” user?
Software freedom, at least for end users, is a smokescreen too. I can invert your argument: "you want more ransomware because of a few OSS enthusiasts?" What we need is a way to curb the excesses, such as the high entrance barriers to the store.
A phone/tablet is a tool, with very intense usage, and huge privacy value, not an engineer's toy.
The real smokescreen is this freedom vs security false dichotomy. If you give up freedom for the promise of security, you get neither. Look at the App Store. It's full of harmful garbage designed to extract value and waste your time by any trick necessary. It's one step short of ransomware. Oh, unless you use an app for your important documents, then it comes under new management and demands you start paying monthly or lose your stuff. Suddenly that lack of freedom to continue using an old version of the app or to dig around its internals and pull out your data becomes a loss of security. It's fine though, because this type of ransomware is totally legal and inline with your benevolent platform dictator's policies.
Your argument falls apart when you consider the iPhone's 60% market share. People have spoken on whether they want dangerous, uncontrolled third-party apps on their phones.
I don't care about what the riff-raff think, it is morally wrong and defies human freedom and dignity to require everyone walk around with a locked-down surveilance device in their pocket in order to function in the economy.
60% of society could be raptured tomorrow and the world would be better off.
This is called the tyranny of the majority, where you're arguing that because most people don't care about freedom, therefore freedom doesn't have value. It's not a sound argument, much like saying freedom of speech doesn't matter because most people have nothing to say.
Editing to add: it seems particularly ironic that you think iPhone users make great purchasing decisions when they buy the phone, but are incapable of making good decisions when selecting software. What accounts for the discrepancy?
Most people are stupid and short-sighted. Pointing to the stupid in support of your argument doesn't help it.
And, the app store does absolutely nothing to prevent "dangerous" apps. Apple doesn't review the code. In fact, if your code is reviewable, it's even harder to get it on the app store.
At the end of the day, the App Store and Play Store are filled with adware, spyware, and other malware - because Apple and Google like it that way. That's what they want. They don't give a single flying fuck about your security. They care about extracting 30% while simultaneously doing as little as possible. That's completely at odds with security, yes, and they know that. They just don't care.
Just in case you unironically don't understand this and aren't just playing it up:
Allowing third-party installations does not mean uncontrolled third-party apps. It merely means users have the option to install software on their phones, which continues to limit the software's capabilities until the user is prompted to allow each one.
You could argue that a braindead person can randomly go on a phishing website, download some .app file, and somehow, as if by magic, click through a theoretical installation dialog to finally, explicitly grant this malware problematic permissions. And I'm sure there are going to be people who do exactly that. But without sideloading they'll still manage to do the same thing to the same effect, just without the app installation, by entering their bank credentials into a phishing site or similar.
The thing you're citing as a problem solved by disallowing app installs isn't actually solved, and it would not become more of a problem either.
Finally, the fact of the matter remains that almost nobody would actually use the capability to install from third party stores, as you've correctly insinuated. But if anything, that should be another proof that allowing third party installs doesn't reduce security.
People just like to have everything provided to them from a single source, and will usually pay a premium for that.
What point are you even trying to make? That's not a counter-argument unless you assume that people in aggregate always make great purchasing decisions. Wait until you hear about cigarettes, heroin, slot machines, snake oil, tulips, and the rest of the effectively infinite list of fun and unique ways people make terrible choices or are bamboozled into acting against their own and others' interests. This is a comment thread about protecting people from scams. The premise acknowledges that people make widespread poor decisions. Is it so unthinkable that buying an iPhone is one of them?
It should be a setting (like on macOS); otherwise full control of all devices is always at the mercy of Apple.
Who said anything about parents?
They are using it as a proxy for "people with low technical skills" (which is a specious argument since it was a friend of my parents who got me into programming and he remains one of the best I've ever known) and making the usual argument that we should limit control of our devices to make it safe for them.
I actually don't have (much) of an issue with walled garden approaches as long as the wall has a gate that is easily opened, give me an OS level toggle with a warning of "Here be dragons" and I can live with it - it's not ideal but it's not a terrible trade off.
It's something Android has had previously (but they seem to be trying to lock that gate) and iOS less so.
I can run anything on my Mac the way you described: go to security settings and tell it I know what I am doing. Is that changing somehow?
How about instead of a single os level toggle you get a trillion dollar company, renowned for their high quality design, invested in providing the best possible UX while respecting the user as the owner of the device?
Tell me more about this mythical mobile device and I'll buy one immediately!
> has a gate that is easily opened
Invariably, the argument is: "users will just be instructed to open the gate by the bad guys, so we can't have a gate!"
I think the burden of proof should be on them to show how often this happens.
Which is something I find very annoying, because I know a lot of parents and grandparents who have greater technical skills than their children.
The point is that they trust it, whether or not Apple trusts it is completely orthogonal, and irrelevant. Apple doesn't own the phone.
Implying the software in the App Store is ""trusted""
What do your kids say to this?
I still don't see why you would want Apple to have a say in what you run on your device, but you do you, I guess.
They don’t. You can still run any software you’d like. You just get warnings, so people like parents don’t just randomly open malicious programs from the internet.
Which is exactly as it should be
Tell me how I can sideload apps on an iPhone? Even with warnings and stuff.
If you compile it from source yourself using Xcode you can deploy to your own device without an Apple developer subscription.
It unfortunately goes away. Last I checked you get 7 days before the app expires. The subscription makes it last much longer, but not forever.
Because they have thousands of employees who have the time to look at the source code and determine whether it is malicious.
Nobody else would bother. That’s why meme language repositories continuously lead to hacks and vulnerabilities.
Apple absolutely does not manually read all the source code of the apps it notarizes.
They don't notarize source code at all. They notarize compiled app binaries. Many or even most App Store apps are closed source.
Apple employees have access to the source code of apps on the App Store?
No.
Technically yes; if they want it, you have to give it to them. The dev agreement and TOS are pretty broad.
Is that (Apple asking for source) a frequent thing?
We don't know.
App developers do know. I can't say that I've ever worked on an app where this request has been made. Neither the App Store Connect Agreement[0] nor the Apple Developer Agreement[1] stipulates that the developer can be compelled to surrender their source code.
[0] https://appstoreconnect.apple.com/WebObjects/iTunesConnect.w... [1] https://developer.apple.com/support/downloads/terms/apple-de...
All the relevant agreements can be found here, so if there's something that specifies this kind of overreach, I'd both be very surprised and interested.
https://developer.apple.com/support/terms/
“If you are required by law, regulation, or court order to disclose any Apple Confidential Information (which can include requests related to legal investigations or audits), you agree to give Apple prompt notice and to cooperate in seeking a protective order or confidential treatment of such information”
What part of this says Apple can compel developers to share their apps' source with Apple?
Edit: oh, are you saying that such requests would be "Apple confidential information" so nobody would say if it happened?
We do know. It has never happened.
This has literally never happened.
You are mixing up with Fdroid, Apple doesn't do any source code reading and the tests they do are very basic.
Right now there are a lot of piracy apps disguised as "note taking apps", and they passed the App Store review without any issues.
Do you have any examples? Asking for a friend.
It's funny how "think of the parents" is the new "think of the children".
It’s tragic how many are baffled by the idea someone might genuinely accept a minor inconvenience to benefit their community.
I strongly dispute that giving megacorporations total control of how we're allowed to use our computing devices is beneficial to any community.
Right, if we could educate users on the tools they use, and if the trillion dollar companies could provide tools to help community members protect each other, we wouldn't be here. Apple doesn't have to be a dictator if they would help the community support each other. Instead they took the easy way out of stripping freedoms from everyone so they can control every device out there. It's a minor inconvenience to be involved in protecting vulnerable people in our community, it's tragic that people just said Apple should take that role.
> I still don’t see why you would want your parents to run untrusted software on their devices, but you do you I guess.
I don't trust Apple's App Store review. They've approved countless scams that have tricked Apple users out of a lot of money, perhaps $billions in total.
Because they're adults who can make their own decisions, not mentally challenged patients under a megacorp's guardianship?
Sadly, about 98% of real-world users are going to fall for scams, ransomware and the like. They are not mentally challenged; there are just so many traps/fakes/tempting things that we as IT people are more aware of (but even we still fall for some).
We also can't count on every person being able to check every single thing they do: how do you check if some food or drug you get is good or not? you can't really, you have to trust someone who knows.
> how do you check if some food or drug you get is good or not? you can't really, you have to trust someone who knows.
Yes - the democratically elected government, not a monopolistic entity with capital interest.
Then that's their own fault and responsibility. You can't build up immunity without exposure.
It’s a bit like the Elizabeth Warren toaster analogy. If you bought a toaster with shoddy wiring and it caught fire and burned down your house, everyone would blame the manufacturer and not sneer at you online for not learning electrical engineering and not checking the wiring yourself before using it.
It's more like if I buy a reliable toaster, but I buy bread that's secretly poisoned by the manufacturer and hurt myself. I'm not gonna demand the toaster maker add a poison sensor to the toaster and say "how dare they didn't protect me!"
I don't buy this in the first place. It is reasonable to expect consumers to do some background research into the products they buy. In fact, it is the only way capitalism can function as a meritocracy.
Society should be more dangerous as a means to force people to learn more about technology they rely on.
How can we trust software anymore? Open source projects are being sold to bad actors. Python default repos are full of malware. Originally blessed and trusted apps are being bought by software companies in dodgy countries. It seems like we can only trust big software companies like Microsoft and Oracle.
Oracle has made several open source projects closed source. Do not trust. At all.
Why?
I think you missed a /s marker. Big companies, trustworthy? And your examples are Oracle and Microsoft?
The only way is to run everything in strict sandboxes. E.g. for a photo editor there's absolutely no reason to open any network connections.
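On macOS this kind of policy can be prototyped with the `sandbox-exec` tool (deprecated since 10.14, but still shipped) and a small SBPL profile. The profile below is only a sketch, and the binary path is illustrative:

```lisp
;; no-network.sb -- illustrative profile: allow everything except networking
(version 1)
(allow default)
(deny network*)
```

You would then launch the app under the profile, e.g. `sandbox-exec -f no-network.sb ./photo-editor`. Apple's supported path for this today is the App Sandbox entitlement system, where an app simply doesn't request the network entitlements.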
I'm building an application that allows you to send a file to your colleagues. That's hardly a revolutionary or unusual use case, and it definitely requires network access and full access to the local file system. I also need the ability to lock files anywhere on the system, and to index the contents of files.
Not only are all of these functions and corresponding permissions completely standard for all kinds of applications, they belong to the core of what any system that calls itself an "operating system" should deliver to developers and end users.
You can see it in action. I have an M1 Ultra Mac Studio, an insanely powerful machine, and when building open source software, actual compilation flies but the autoconf step crawls, because it has to build test binaries to probe OS features, and notarization slows that down dramatically.
Notarization is completely optional when building any OSS software on a Mac, and not part of any default build process I know. A Mac can sign builds for running locally, a process which is fast, completely local, and doesn't require building test binaries or anything like that. Even a Mac building for an iPhone in developer mode has a local cert it can use, and doesn't require notarization.
Notarization is only needed when distributing binaries to others. Personally I do it once a month for the Mac app I distribute.
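For reference, the distribute-time flow is a short sequence of Apple's own CLI tools. This is a sketch; the signing identity, app name, and keychain profile name are all placeholders you'd substitute with your own:

```shell
# Sign with the hardened runtime (required for notarization).
# The "Developer ID Application" identity string is a placeholder.
codesign --force --options runtime --timestamp \
  --sign "Developer ID Application: Example Corp (TEAMID1234)" MyApp.app

# Zip the bundle and submit it to Apple's notary service;
# --wait blocks until Apple returns a verdict.
ditto -c -k --keepParent MyApp.app MyApp.zip
xcrun notarytool submit MyApp.zip --keychain-profile "notary-creds" --wait

# Staple the notarization ticket so Gatekeeper can verify it offline.
xcrun stapler staple MyApp.app
```

The credentials in the keychain profile are set up once with `xcrun notarytool store-credentials`, which is what makes the monthly run painless to automate.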