Most of the development for pmtiles has happened in the last 2 years, including a maturing server implementation (http://github.com/protomaps/go-pmtiles), but some key parts are still missing, like the ability to decode an archive in native mobile applications. SQLite (MBTiles) already has 10+ years of integration into the mapping ecosystem, so it still works better if you want to move tilesets between desktop applications and mobile devices.
The situation Paul is addressing is one unique to OpenStreetMap itself: minute-level updates of a global-scale tileset. This is a use case pmtiles is explicitly not designed to address; a database is a better fit there.
Appreciate you weighing in directly, Brandon! To be clear though, I didn't mean to say that protomaps was potentially a better solution (of course an uncompressed DB makes much more sense), simply that the developing nature of the protocol means its inclusion within the "ecosystem" shouldn't be a closed case :)
Which is to say, distribution of OSM data feels like a large part of the process. Of course there are various bottlenecks / considerations around edits / writes, but in practice surely reads are the bigger factor. I wonder how many OSM-external use cases rely on "minutely" updates or need the full fidelity of the raw data source. Feels like there is a solid case for providing less frequent (hourly? daily?) official updates via official "single-file" formats that could be widely distributed to the benefit of all, and e.g. allow OSM to loosen up its hot-linking policies and ensure continued investment in the chosen protocols.
But mainly I was questioning how a somewhat proven format like SQLite, with its many benefits (interoperability, distribution, etc.), would be so easily dropped from consideration without even a test having been run. Just my thoughts of course!
There is nothing stopping the OSM Foundation from, say, offering a complete SQLite (or PMTiles) tileset download on planet.openstreetmap.org - technically, legally, or otherwise. An archive of the tiles shown on osm.org would come to at least a couple of terabytes once you render down to around zoom level 16.
The key distinction is "official" - the only "official" data products of OpenStreetMap are its XML and PBF data dumps at planet.openstreetmap.org. The frequently-updated tiles you see on osm.org are "quasi-official": they're created by a separate project called OpenStreetMap Carto. These tiles have a special status within the OSM ecosystem for historical reasons; by virtue of OSM being map data, it should probably show something for human eyeballs on the website.
The design goal of OSM Carto is to give feedback to map editors; the linked vector tiles project is intended as a successor to, or at least a complement to, OSM Carto. The consumption of the tiles by third-party sites is a side effect tolerated by OSMF; the general consensus within OSM seems to be that a consumable tile service for third parties is outside the scope of the project.
Definitely not in a position to question the status quo, let alone the OSM community consensus. Other than to say that - even if the focus is currently on internal software / editors alone - it still feels like a massive opportunity. And one that aligns with a number of strategic goals of OSMF[1], specifically around efforts to grow the community, extend the core developer base (and more) ... by fostering a closer relationship between OSM data-derived products and the wider (non-editor) community.