For example, almost all Electron apps keep the last two or so versions around when they auto-update. And since Electron apps are around 250 MB, that's 0.5 GB of old versions per app.
I recently discovered a big cache - in terms of file count, about 6,000 files - of basically every edit I made in the last 6 months in VS Code, called Local History. Also, every directory you launch VS Code in gets a workspace cache associated with it; it can be 200 MB if there are C++ files in it, and VS Code will not remove it when you delete the original directory.
And so on...
I wish they would all integrate with a central cache-management system app, where you could see them and clear them. A bit like on Android.
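Absent such a system app, a rough audit like the one above can be scripted. A minimal sketch: walk a list of candidate cache directories and report their sizes, largest first. The VS Code paths below are assumptions for a typical macOS install; substitute your own.

```python
import os

def dir_size(path):
    """Total size in bytes of all regular files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp) and not os.path.islink(fp):
                total += os.path.getsize(fp)
    return total

def audit(paths):
    """Return (path, size) pairs for the paths that exist, largest first."""
    found = [(p, dir_size(p)) for p in paths if os.path.isdir(p)]
    return sorted(found, key=lambda t: t[1], reverse=True)

# Candidate cache locations -- assumed paths, check your own setup.
home = os.path.expanduser("~")
candidates = [
    os.path.join(home, "Library/Application Support/Code/User/History"),
    os.path.join(home, "Library/Application Support/Code/CachedData"),
]
for path, size in audit(candidates):
    print(f"{size / 1e6:8.1f} MB  {path}")
```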
Xcode is a huge offender, at least for me. At 36GB, not only is it the largest application in my Applications folder (it is larger than all my other applications combined), it also litters /Library/Developer with about 9GB of shit (mostly simulator runtimes), and then litters ~/Library/Developer with 33GB of shit (mostly caches). Almost 80GB disk footprint... just to develop software?
Except Xcode can't possibly know beforehand when I'll need the space back. Say I have 10 GB free, so Xcode decides my disk space isn't low enough to warrant deleting anything, while I'm trying to install Premiere Pro, which requires 30 GB (just an example), and I can't, because I don't have enough disk space.
How about not trying to guess when the user needs the space?
This is the same problem I have with the "unused RAM is wasted RAM" argument. As long as at most one application on a system uses this strategy, it works. As soon as you have two independent programs each trying to cache as much as possible, it falls apart.
For a while, the browser could reasonably assume it was the only application using this strategy, and that assumption held. Now that native executables are being replaced by web-based applications, every program tries to maximize its RAM usage, with predictable results.
Similar assumptions fail on filesystems that do data compression or deduplication - you can easily overestimate how much space something is actually taking.
I find it so embarrassing, because every dev gets fed this culture. Or take the insane update sizes, for the OS or Xcode. Or the upload to the App Store using several times the binary size in bandwidth. An app that used to be 0.5 MB in Obj-C is now, in Swift, 15 MB on TestFlight, a 70 MB production IPA, and 300 MB of upload traffic.
That's what sustainability looks like in the eyes of a hardware supplier.
Xcode is actually only about 17 GB on disk, assuming you haven’t copied it anywhere. When it’s installed, transparent APFS compression is applied to the bundle.
I know, right? It’s only multiple OSes, multiple simulators, duplicated binaries for all of them. Totally unreasonable for it to be 17 gigs! After all, Chrome manages to fit in only half a gig, and it has, like, a whole browser.
It's all there because the usability is much better if you download one thing, and then no matter what you then choose to work on, it's available.
That said, I think Xcode's code signature does let you delete whichever SDKs and simulators you want without invalidating the main package signature? (This is _very_ much an "I think" case.) That would remove the "force" element you find to be a problem.
> of the download/onboarding! Which is not what I should spend most my time with Xcode anyway.
No, of the runtime use. There is no need for an up-front "what hardware are you planning to target?" "what OS versions are you planning to target?" step.
The several hour download when high speed internet is not available is obviously annoying, but it is not remotely "insane amounts of energy".
But on the other hand, you don't end up in a position where you think you've made the correct selection, and then a little later you go to do something and can't, and have to wait on another download right when you wanted to do something.
The approach of "everything in one bundle" is the correct option from a usability point of view, and that is clearly what has been favored. It is also clearly, and reasonably, frustrating that this one download is large and takes very long on slower connections. The thing is, the single download means the install path is "start the download; go to bed; get up and use the freshly downloaded gigantic bundle", with no risk of thinking you're done and then discovering you don't have everything you need. I suspect the latter would result in many more people complaining on Twitter, or needing support.
I do want to be clear though that I agree the large download is annoying, and surely something could be done to reduce the size.
I assume you are referring to Xcode's on-disk size on Apple Silicon, because I downloaded Xcode 14.0.1 yesterday (a 7.6 GB download) which expanded to 23.01 GB on disk. This is on an MBP with an Intel chip.
That’s likely uncompressed. Try running du -h -d0 against it to see its on-disk usage. If it’s still high, you probably accidentally decompressed it and might benefit from a reinstall to fix that.
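The gap between the two numbers is apparent size versus allocated blocks, which is what `du` reports. A small illustration of the difference, using a sparse file as a stand-in for transparently compressed data (`st_blocks` is always in 512-byte units):

```python
import os

def apparent_and_ondisk(path):
    """Return (apparent_size, on_disk_size) in bytes for a file.
    On-disk size counts allocated blocks; for sparse or
    transparently compressed files it can be far smaller
    than the apparent size."""
    st = os.stat(path)
    return st.st_size, st.st_blocks * 512  # st_blocks is in 512-byte units

# A sparse file: 10 MB apparent, almost nothing actually allocated.
with open("sparse.bin", "wb") as f:
    f.seek(10 * 1024 * 1024 - 1)
    f.write(b"\0")

apparent, ondisk = apparent_and_ondisk("sparse.bin")
print(f"apparent: {apparent} bytes, on disk: {ondisk} bytes")
```

Finder and `ls -l` show the apparent size; `du` shows the allocated size, which is why an APFS-compressed Xcode bundle measures so differently depending on the tool.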
Symbols are a godsend when dealing with difficult crash reports. Have you ever tried walking through the process of analyzing one with an expert? It's a great way to level up your diagnosis skills.
Frankly, it’s not been much of an issue. I tend to be super-anal about quality, and post-release crashes are rare as hen’s teeth, which makes the ones that do happen more important to diagnose.
I usually get a general idea of where the crash happens, and scanning the source has always shown me the issue.
Symbolicating a crash report is annoying but mostly mechanical (fwiw you can open them in Xcode and it will pull in all the dSYMs for you). Reading a crash report is an art.
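The "mechanical" part is essentially mapping raw return addresses back to symbol names via a sorted symbol table. A toy sketch of that lookup (the addresses and symbol names are made up, and this is not Xcode's actual format):

```python
import bisect

# Hypothetical symbol table: (load address, symbol name), sorted by address.
SYMBOLS = [
    (0x100000000, "main"),
    (0x100003000, "-[ViewController viewDidLoad]"),
    (0x100004800, "parseConfig"),
]

def symbolicate(addr, symbols=SYMBOLS):
    """Map a crash address to 'symbol + offset' via binary search."""
    starts = [s[0] for s in symbols]
    i = bisect.bisect_right(starts, addr) - 1
    if i < 0:
        return hex(addr)  # address below every known symbol
    start, name = symbols[i]
    return f"{name} + {addr - start:#x}"

# A made-up backtrace from a crash report:
for frame in (0x100003f2c, 0x100004810):
    print(symbolicate(frame))
```

The art part - working out *why* the symbolicated frames crashed - is the piece no tool automates.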
You probably know this, but relying on your local machine and storage for this is a bad idea for production apps.
There are a ton of services that handle all the “mechanical” stuff for you (e.g. AppCenter, ex-HockeyApp), and you can just look at the symbolicated crash in your browser.
For production apps I would centralize symbols, yes. But I have a strong disdain for crash reporting services, which are typically authored by people designing for aggregating crashes and showing pretty charts rather than really getting deep into the actual log itself. Most have pitifully little information compared to an actual crash log: often missing thread state, exception codes, and memory mappings.
> I recently discovered a big cache - in terms of file number, 6000 files - of basically every edit I did in the last 6 months in VS Code - called Local History.
Isn't this a feature? I wouldn't want my editor to remove my local history without me knowing. I frequently use this local history (in Intellij), for whatever reason (it's easier than git, the project doesn't have version control, I haven't committed yet, ...)
6 months is a very long time for this though. I imagine that I would almost never want local history past a day or two. Beyond that I would look to my version control system.
I do think there should be a default expiry, but I’ve absolutely gone back many, many months and found old versions of files in cases where I didn’t have proper commits.
Yeah, for sure, but this tool is really good for helping you visualize and discover what's eating up your storage, better than anything built into any of the OSes I use, anyway.
The source is available (https://invent.kde.org/utilities/filelight), and there's a chance it'll build on macOS. I haven't tried, though (and am not the original recommender).
There should be one directory where all apps keep caches in subdirectories, managed by the operating system, evicting cached files as necessary using some LRU policy. (MacOS has well-defined cache directories, but nothing managing them.)
Likewise for system RAM. Lots of apps are using RAM to cache with no way to coordinate.
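An eviction pass for such an OS-managed cache directory could look roughly like this sketch: walk the cache, sort files by last access time, and delete the least-recently-used until the total fits a budget. The directory layout and budget are made up for illustration.

```python
import os

def evict_lru(cache_dir, budget_bytes):
    """Delete least-recently-accessed files until the cache
    directory's total size is at or under budget_bytes.
    Returns the list of deleted paths."""
    entries = []
    for root, _dirs, files in os.walk(cache_dir):
        for name in files:
            fp = os.path.join(root, name)
            st = os.stat(fp)
            entries.append((st.st_atime, st.st_size, fp))
    entries.sort()  # oldest access time first
    total = sum(size for _atime, size, _fp in entries)
    removed = []
    for _atime, size, fp in entries:
        if total <= budget_bytes:
            break
        os.remove(fp)
        total -= size
        removed.append(fp)
    return removed
```

A real implementation would track accesses explicitly rather than trust `st_atime`, since many filesystems are mounted `noatime`; the point is only that the policy itself is simple once one party owns the directory.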
I once had a Photos library that consumed 250 GB - for 80 GB worth of photos. Thankfully there was some third-party software that could automatically export and reimport all photos (it took hours), which shrank the library back to its actual size.
Of course not. But why would I need "every saved state of every file"? The number of times that would have saved me is zero; I've never been in a situation where it would have. Things I need saved go into git as commits or stashes, and the rest are things I don't need saved.
What scenarios are there where things like this would have saved me somehow?