Hi OVH Community,
I tried to follow the docs and set up OIDC for a Managed Kubernetes cluster.
I used Terraform as suggested, and it created the cluster with the OIDC configuration as suggested, but adapted to the Google OpenID token claims.
> resource "ovh_cloud_project_kube_oidc" "cluster-oidc" {
> service_name = var.ovh_public_cloud_project_id
> kube_id = ovh_cloud_project_kube.cluster.id
>
> #required field
> client_id = var.oidc_client_id
> issuer_url = var.oidc_issuer_url
>
> oidc_username_claim = "email"
> oidc_username_prefix = "oidc:"
> depends_on = ovh_cloud_project_kube.cluster]
> }
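Since this all hinges on what Google actually puts into its tokens, it can help to check the issuer and the advertised claims against Google's OIDC discovery document before wiring things up. A quick sketch (assuming curl and jq are available):
> curl -s https://accounts.google.com/.well-known/openid-configuration | jq '{issuer, claims_supported}'
The issuer there has to match issuer_url, and email shows up in claims_supported while groups does not.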
I also created the ClusterRoleBinding with the HashiCorp Kubernetes provider.
> resource "kubernetes_cluster_role_binding" "oidc-cluster-admin" {
> metadata {
> name = "oidc-cluster-admin"
> }
> role_ref {
> api_group = "rbac.authorization.k8s.io"
> kind = "ClusterRole"
> name = "cluster-admin"
> }
> subject {
> kind = "User"
> name = "oidc:some.email@fqdn.com"
> api_group = "rbac.authorization.k8s.io"
> }
> depends_on = [ovh_cloud_project_kube_oidc.cluster-oidc]
> }
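For completeness: the subject name in this binding has to be exactly the configured oidc_username_prefix plus the value of the email claim (so oidc:some.email@fqdn.com here). What actually landed in the cluster can be checked with something like this (assuming an admin kubeconfig):
> kubectl get clusterrolebinding oidc-cluster-admin -o yaml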
On the kubectl side I used kubelogin and configured the user like this.
> - name: oidc
>   user:
>     exec:
>       apiVersion: client.authentication.k8s.io/v1beta1
>       args:
>         - oidc-login
>         - get-token
>         - --oidc-issuer-url=https://accounts.google.com
>         - --oidc-client-id=
>         - --oidc-client-secret=
>         - --oidc-extra-scope=email
>         - --oidc-extra-scope=profile
>         - --oidc-extra-scope=openid
>         - -v10
>       command: kubectl
>       env: null
>       provideClusterInfo: false
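To see which claims Google actually returns for the client configured above (and to confirm that email is present while groups is not), kubelogin's setup helper can be run on its own. A sketch, assuming the same client ID and secret as in the user entry above are filled in:
> # runs the browser login flow and prints the claims of the resulting ID token
> kubectl oidc-login setup \
>   --oidc-issuer-url=https://accounts.google.com \
>   --oidc-client-id=<client-id> \
>   --oidc-client-secret=<client-secret> \
>   --oidc-extra-scope=email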
I debugged this a bit. The OVH Managed Kubernetes apiserver audit log output is nice, but it is really painful to use when searching for issues, because there are no filters or other useful tools. (Yes, this could be seen as a feature request.)
On the kubectl side it seems not to work because of the missing groups claim, which is not provided by Google OpenID Connect tokens.
> I0425 09:26:59.167863 42551 get_token.go:107] you already have a valid token until 2023-04-25 10:26:58 +0200 CEST
> I0425 09:26:59.167869 42551 get_token.go:114] writing the token to client-go
> E0425 09:26:59.169871 42291 memcache.go:265] couldn't get current server API group list: the server has asked for the client to provide credentials
> error: You must be logged in to the server (the server has asked for the client to provide credentials)
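To rule out a mismatch between what the token contains and what the cluster expects, the token that the exec plugin hands to kubectl can be decoded directly. A debugging sketch (assumes jq and python3 are available and the client ID and secret are filled in):
> # extract the ID token kubelogin would hand to kubectl
> TOKEN=$(kubectl oidc-login get-token \
>   --oidc-issuer-url=https://accounts.google.com \
>   --oidc-client-id=<client-id> \
>   --oidc-client-secret=<client-secret> \
>   --oidc-extra-scope=email | jq -r '.status.token')
> # decode the JWT payload (second dot-separated segment, base64url-encoded)
> echo "$TOKEN" | cut -d. -f2 | python3 -c 'import base64,sys; s=sys.stdin.read().strip(); print(base64.urlsafe_b64decode(s + "=" * (-len(s) % 4)).decode())'
The iss and aud values in the decoded payload have to match issuer_url and client_id from the Terraform resource, and the email value (with the oidc: prefix added by the apiserver) has to match the subject of the ClusterRoleBinding.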
There are other projects that also have issues because of this behavior of the Google tokens.
Is there anyone who got this to work with Google OpenID Connect?
If so, could you please provide more information?
My goal is to be able to authenticate with our company's Google Workspace accounts.
If it works without groups, that is okay for me at the moment.
Thank you for your help.