I read an interesting thread on Hacker News in response to a post: “Why is OAuth still hard in 2023”. The post and comments bring up a lot of real issues with OAuth. The article ends with a pitch for the author’s product Nango, which advertises support for OAuth2 flows for 90+ APIs, justifying the product’s existence.
We don’t need 90 browsers to open 90 websites, so why is this the case with OAuth2? In a similar vein, the popular passport.js project has 538(!) modules for authenticating with various services, most of which likely use OAuth2. All of these are separate NPM packages.
Anyway, I’ve been wanting to write this article for a while. It’s not a direct response to the Nango article, but it’s a similar take with a different solution.
My perspective
I’ve been working on an OAuth2 server for a few years now, and last year I released an open source OAuth2 client.
Since I released the client, I’ve gotten several new features and requests, all contributed by users of the library. A few of note:
- Allowing `client_id` and `client_secret` to be sent in request bodies instead of the `Authorization` header (see the sketch after this list).
- Allow ‘extra parameters’ to be sent with some OAuth2 flows. Many servers, including Auth0, require these.
- Allow users to add their own HTTP headers, for the same reason.
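To make the first item concrete, here is a minimal sketch of the two places a client can put its credentials when calling the token endpoint. The endpoint URL and credentials are placeholders, and the `client_credentials` grant is used only to keep the example short; the point is where the credentials go, not which flow is used.

```typescript
// Two common ways a client authenticates itself at the token endpoint.
// The endpoint and credentials below are placeholders.

const tokenEndpoint = 'https://auth.example.org/token';
const clientId = 'my-client-id';
const clientSecret = 'my-client-secret';

// 1. Credentials in the Authorization header (HTTP Basic), which RFC 6749
//    requires servers to support for clients that have a client secret.
await fetch(tokenEndpoint, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    'Authorization': 'Basic ' + btoa(`${clientId}:${clientSecret}`),
  },
  body: new URLSearchParams({
    grant_type: 'client_credentials',
  }),
});

// 2. Credentials in the request body, which some servers expect instead.
await fetch(tokenEndpoint, {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({
    grant_type: 'client_credentials',
    client_id: clientId,
    client_secret: clientSecret,
  }),
});
```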
What these have in common is that there are a lot of different OAuth2 servers, each wanting things in a slightly different or more specific way.
I kind of expected this. It was never going to be enough to just implement OAuth2: the library only really gets there once people try it with different servers, run into mild incompatibilities, and workarounds get added for them.
Although I think OAuth2 is pretty well defined, the full breadth of specs and implementations means that it’s not enough (as an API developer) to just tell your users: “We use OAuth2”.
For the typical case, you might have to tell them something like this:
- We use OAuth2.
- We use the `authorization_code` flow.
- Your `client_id` is X.
- Our ‘token endpoint’ is Y.
- Our ‘authorization endpoint’ is Z.
- We require PKCE.
- Requests to the “token” endpoint require credentials to be sent in a body.
- Any custom non-standard extensions.
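In code, all of that detail ends up as client configuration before a single request is made. The sketch below uses entirely hypothetical option names and values; it is not the API of any particular library, just an illustration of how much server-specific knowledge a user needs up front.

```typescript
// Hypothetical configuration for connecting to "an OAuth2 API".
// Every option name and value here is a placeholder for illustration.
const clientSettings = {
  clientId: 'X',
  tokenEndpoint: 'https://api.example.org/oauth/token',        // "Y"
  authorizationEndpoint: 'https://api.example.org/oauth/authorize', // "Z"
  grantType: 'authorization_code',
  usePkce: true,
  // Some servers want credentials in the POST body instead of the
  // Authorization header.
  clientAuthMethod: 'client_secret_post',
  // Plus any non-standard extension parameters the server requires.
  extraParams: { audience: 'https://api.example.org/' },
};
```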
To some extent this is by design. The OAuth2 spec calls itself: “The OAuth 2.0 Authorization Framework”. It’s not saying it is the protocol, but rather it’s a set of really good building blocks to implement your own authentication.
But for users that want to use generic OAuth2 tooling, this is not ideal. Not only because of the amount of information that needs to be shared, but also because it requires users of your API to be familiar with all these terms.
A side-effect of this is that API vendors that use OAuth2 are more likely to roll their own SDKs, so they can insulate users from these implementation details. It also creates a market for products like Nango and Passport.js.
Another result is that I see many people invent their own authentication flows with JWT and refresh tokens from scratch, even though OAuth2 would be a good fit. Most people only need a small part of OAuth2, but to understand which small part you need, you’ll have to wade through and understand a dozen IETF RFC documents, some of which are still drafts.
Sidenote: OpenID Connect is another dimension on top of this. OpenID Connect builds on OAuth2 and adds many features and another set of dense technical specs that are (in my opinion) even harder to read.
OAuth2 as a framework is really good and very successful. But it’s not as good at being a generic protocol that people can write generic code for.
Solving the setup issue
There’s a nice OAuth2 feature called “OAuth 2.0 Authorization Server Metadata”, defined in RFC 8414. This is a JSON document sitting at a predictable URL, https://your-server/.well-known/oauth-authorization-server, and it can tell clients where the authorization and token endpoints are, which grant types the server supports, and whether features like PKCE are available.
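As a rough sketch of what discovery looks like from the client side (the server URL is a placeholder, and the interface only lists a handful of the fields RFC 8414 defines):

```typescript
// Minimal sketch of OAuth2 server metadata discovery (RFC 8414).

// A subset of the metadata fields defined by RFC 8414.
interface ServerMetadata {
  issuer: string;
  authorization_endpoint?: string;
  token_endpoint?: string;
  grant_types_supported?: string[];
  token_endpoint_auth_methods_supported?: string[];
  code_challenge_methods_supported?: string[];
}

async function discover(serverUrl: string): Promise<ServerMetadata> {
  // The well-known path is fixed by the RFC; only the host varies.
  const res = await fetch(
    new URL('/.well-known/oauth-authorization-server', serverUrl)
  );
  if (!res.ok) {
    throw new Error(`Metadata discovery failed: HTTP ${res.status}`);
  }
  return await res.json() as ServerMetadata;
}

// A client can now find the endpoints and supported features itself,
// instead of the user copying them out of API documentation.
const meta = await discover('https://auth.example.org');
console.log(meta.token_endpoint, meta.code_challenge_methods_supported);
```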