CORS in SAFE Browsers

So, right now, Electron/Chromium etc. have CORS enabled, which means the SAFE Browser does too.

What is CORS?

It’s a mechanism for limiting which domains scripts can fetch data from on the clearnet.

Recently @latch came upon some CORS issues fetching content via the native browser fetch API.
To get around it, we’ve got a branch which is permissive in terms of CORS. The pseudo-server we use for managing HTTP responses will return Access-Control-Allow-Origin: *, so any SAFE site can access the data.
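Just to make that concrete (this is an illustrative sketch, not the actual browser code; addPermissiveCors is a made-up name), the change amounts to the pseudo-server stamping that one header onto every response:

```javascript
// Illustrative sketch only: a response-header helper of the kind the
// permissive branch adds. 'addPermissiveCors' is a hypothetical name.
function addPermissiveCors(headers) {
  // '*' tells the renderer that any origin may read this response
  return { ...headers, 'Access-Control-Allow-Origin': '*' };
}

const headers = addPermissiveCors({ 'Content-Type': 'text/html' });
console.log(headers['Access-Control-Allow-Origin']); // '*'
```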

This is in line with the fact that, currently, any SAFE site could access that same data via the safe.fetch API anyway.

So it seems no harm done?

That’s really my question here, or something I’d like to properly consider: do we want/need CORS in a SAFE browser?

As above. CORS is ostensibly about limiting data access to certain domains (afaik). But on SAFE anything public should be just that. So in that sense, CORS goes against our SAFE ideals.

Also, in terms of limiting data access, we have other data types. So theoretically if we want to limit data access to certain sites/apps, we’re free to do that.

Does anyone see any issue with the above? Anything I’m missing here?

If not, it’s a tiny wee fix to get merged into the browser.

If we do want something CORS like, well we should probably be looking to implement that functionality for safe.fetch too…

Anyway, would be grand to hear some thoughts on that.

(@latch, @happybeing, @bzee, @ravinderjangra, @Shane tagging yous as you may well have opinions here).


If I’m being honest, I’ve been against the idea of supporting the native browser fetch from the very beginning.

This isn’t the clearnet; we shouldn’t be providing interop with clearnet resources. As soon as we start down that path, complexities like these pop up, where we have to make compromises or perform extra work to support libraries which weren’t built to work on SAFE. If we are going to support fetch, IMO it should be re-implemented as a simple wrapper around safe.fetch.
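A rough sketch of what such a wrapper could look like, with safeFetch standing in for the real safe.fetch (the names and return shapes here are assumptions, not the actual API):

```javascript
// Stand-in for the real safe.fetch: assume it resolves to raw bytes
// plus a media type. This is a mock, not the real API surface.
async function safeFetch(url) {
  return { bytes: Buffer.from('<html></html>'), mediaType: 'text/html' };
}

// A native-fetch-shaped shim that delegates everything to safeFetch,
// so no HTTP server, and therefore no CORS, is involved at all.
async function fetchShim(url) {
  const { bytes, mediaType } = await safeFetch(url);
  return {
    ok: true,
    status: 200,
    headers: { 'content-type': mediaType },
    text: async () => bytes.toString('utf8'),
  };
}
```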

With that being said: CORS has no place in the SAFE network. Anything public is accessible via anything else public, and I don’t really see any additional risks in disabling CORS protection. It’s possible for there to be no NRS involved in a webpage to begin with (if you’re directly connecting to a FilesContainer), so the idea of limiting requests by NRS makes very little sense.


Thanks for being honest! It’s important. And I can’t say I disagree.

Indeed. This idea has been punted around a bit, but it just comes down to how best (and whether we should) provide an HTTP response API. Avoiding having to run a server would be ideal IMO, but I would also probably not want that in the core APIs.

For adoption, some simple HTTP-generating API is probably key (e.g. getting video or music players to work quickly).

also good to know your thoughts on CORS :+1: :slight_smile:


I’m really familiar with the domain of supporting video and music players; my previous full-time job involved building a custom live video delivery platform aimed at minimal latency.

There are a few aspects we’re going to need to support out of the box for them to work, and I figured you would be more familiar than I am with the inner workings of the browser / how our fetch API is exposed / how that interops with the SAFE network behind the scenes.

  1. Being able to SEEK in files is very important; most files on the clearnet are stored as a single large MP4 / AVI / MOV. On the SAFE network these will be stored in chunks, so how will the SAFE browser / network handle Range request headers and Accept-Ranges / Content-Range response headers?
  2. If you don’t support SEEKing, is there any guarantee about the order of chunks which make up a file? For instance, to begin playing a large MP4, you’re going to need the chunks to be delivered in the right order (the first chunk first, which contains codec information; then, going forward, chunks containing I-frames have to arrive before chunks containing P-frames / B-frames, otherwise you end up with distorted video).

If the order of chunks when downloading a video is essentially randomised, you would have to wait for the entire video to download before being able to watch it. Alternatively, you could deliver a complete m3u8 manifest instead of an MP4 and just play the video files in the manifest in order, but converting an MP4 to an HLS with ffmpeg is outside of the scope of most laypeople’s abilities.
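On point 1, the negotiation the browser side would have to do is defined by HTTP itself. A minimal sketch of turning a Range request header into byte offsets plus the Content-Range header for a 206 Partial Content response (the SAFE-side chunk retrieval is assumed to exist and isn’t shown):

```javascript
// Parse a single-range 'Range: bytes=...' header against a known file
// size. Returns null (i.e. respond 416) when the range is unsatisfiable.
function parseRange(rangeHeader, totalSize) {
  const m = /^bytes=(\d*)-(\d*)$/.exec(rangeHeader);
  if (!m || (!m[1] && !m[2])) return null;
  // 'bytes=-500' means the final 500 bytes; 'bytes=500-' means 500 to EOF.
  const start = m[1] ? parseInt(m[1], 10) : totalSize - parseInt(m[2], 10);
  const end = m[1] && m[2] ? parseInt(m[2], 10) : totalSize - 1;
  if (start < 0 || start > end || end >= totalSize) return null;
  // start/end say which bytes to pull from the network; contentRange is
  // what goes back in the 206 response header.
  return { start, end, contentRange: `bytes ${start}-${end}/${totalSize}` };
}
```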


I think the key point is CORS doesn’t work in SAFE for the reasons mentioned, so there’s no point supporting it.

The question for me is do we want or need the kind of protection that CORS is designed to provide, and if so, how can that be done?

CORS is, if I understand it, a way for a web service to limit what can be accessed from within its service. So, for example, to say that code on this website can only do certain operations on other websites, or none at all.

Please correct me if I’m missing stuff or just plain wrong.

If that’s the case, what is the purpose? And does that apply to SAFE?

I’m not sure of all the reasons why you would want to use CORS, but reducing security vulnerabilities is part of it - eliminating cross-site scripting attacks is, I think, an example. Preventing your website from being co-opted into DDoS attacks on other sites is probably another, and so on.

So, are there similar kinds of attack which would apply to SAFE? Attacks on servers obviously not. But could clients be co-opted in a similar way to servers, if say the code from one SAFE app has complete freedom to access everything? I’m not sure if that’s a problem or if there’s anything we could do about it if it was but…

Maybe a user would want to know if the app being used is accessing stuff that it isn’t expected to?

I think these are post MVP issues, so not worth too much effort at this stage. So while I don’t claim to understand CORS or the kind of problem we might want to mitigate on SAFE I’m fairly confident that supporting CORS does not make sense in SAFE. And I don’t think we should worry much about rogue apps just yet, at least not until MVP.


So we’ve had this functionality via the HTTP request/response server before. Basically, for files, there are (or have been… not sure they’re exposed in the new API just yet) APIs for partial requests. So we had that set up nicely previously. I’d imagine we can get that going again without too much bother. (More a Q of defining how we want our APIs to look first and ensuring we’re exposing something sane.)

Good to know you’ve experience here. I definitely do not know the inner workings of video libs/clients so great to have someone to keep us straight here :+1:

Yeh, I agree. Good Qs there.

Aye could be.

It’s less a Q of ‘supporting CORS’, as that’s baked into chromium. It perhaps could be rephrased as ‘shall we negate CORS for now’, and sounds like we’re leaning towards ‘yes’.

And then we can consider the Qs you’re raising and what we may need to tackle there. :+1:


I’ve read all non-code level documentation I’ve found on SAFE. I don’t yet have enough experience working with SAFE to have a design preference. As that changes I’ll have more to add.

As SAFE is a modern rethinking of secure and private abstracted networking, my general instinct would be to implement an ideal native API and additionally provide a compatibility layer using the native API to extend common interfaces such as the Fetch API.

My rationale is that affording drop-in compatibility would both ease on-boarding and support tech stacks lacking native API client libs. Being able to deploy a clearnet web app to SAFE with few if any modifications would be a pleasant first SAFE developer experience.

What are the primary arguments against providing such a compatibility layer?


I don’t think there’s any argument against such a layer. It becomes a question of who/when/how and then also what does that layer need in terms of APIs / data structs.

I view that as somewhat orthogonal to the CORS q here. So perhaps something to fire in to another thread about, speccing out how such a layer might look.


FYI safenetworkjs was designed with this capability in mind, making it easy to add support for new REST protocols and providing an implementation of LDP on which a Solid compatibility layer was based.


Is there not a way to modify how the browser actually sees a subset of data from a FilesContainer so as not to treat this as a cross-site request in the first place? This would essentially treat that FilesContainer as a site index rather than the actual XOR of individual files. Or am I misunderstanding?

If you put your files container under an NRS site, you effectively do that, so that sorts CORS.

The issue was with pointing directly at XorUrls (which are each effectively their own domain).
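To illustrate why (the URLs below are made up): the renderer’s same-origin check compares scheme and host, and every raw XorUrl presents a distinct host, so any subresource fetch between two XorUrls looks cross-origin:

```javascript
// Same-origin comparison roughly as the renderer performs it
// (scheme + host, with host including any port).
function sameOrigin(a, b) {
  const ua = new URL(a);
  const ub = new URL(b);
  return ua.protocol === ub.protocol && ua.host === ub.host;
}

// Two resources under one NRS name share an origin...
sameOrigin('safe://mysite/index.html', 'safe://mysite/app.js');
// ...but two raw XorUrls never do, so such fetches trip CORS.
sameOrigin('safe://hyxf1111/index.html', 'safe://hyxf2222/app.js');
```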