Hi OVH Community,
I have an issue with the OIDC setup inside a managed Kubernetes cluster when I use Google as the OpenID provider.
The setup was built as described in this documentation.
The OVH Terraform provider and the HashiCorp Kubernetes provider were used to create the cluster and the ClusterRoleBinding.
kubelogin is used as the kubectl credential plugin.
> resource "ovh_cloud_project_kube_oidc" "cluster-oidc" {
> service_name = var.ovh_public_cloud_project_id
> kube_id = ovh_cloud_project_kube.cluster.id
>
> #required field
> client_id = var.oidc_client_id
> issuer_url = var.oidc_issuer_url
>
> oidc_username_claim = "email"
> oidc_username_prefix = "oidc:"
> depends_on = [ovh_cloud_project_kube.cluster]
> }
> resource "kubernetes_cluster_role_binding" "oidc-cluster-admin" {
> metadata {
> name = "oidc-cluster-admin"
> }
> role_ref {
> api_group = "rbac.authorization.k8s.io"
> kind = "ClusterRole"
> name = "cluster-admin"
> }
> subject {
> kind = "User"
> name = "oidc:my.user@fqdn.something"
> api_group = "rbac.authorization.k8s.io"
> }
> depends_on = [ovh_cloud_project_kube_oidc.cluster-oidc]
> }
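As far as I can tell, the RBAC half can be checked in isolation via impersonation with the admin kubeconfig; a minimal sketch, using the example subject from the binding above, which at least rules out the binding itself:
> # Run against the admin kubeconfig; impersonation bypasses OIDC entirely,
> # so this exercises only the ClusterRoleBinding, not the token validation.
> kubectl auth can-i get pods --as="oidc:my.user@fqdn.something"
> # expected output if the binding is effective: yes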
The user entry in the kubeconfig looks like this:
> - name: oidc
>   user:
>     exec:
>       apiVersion: client.authentication.k8s.io/v1beta1
>       args:
>       - oidc-login
>       - get-token
>       - --oidc-issuer-url=https://accounts.google.com
>       - --oidc-client-id=somestring.apps.googleusercontent.com
>       - --oidc-client-secret=superseret
>       - --oidc-extra-scope=email
>       - --oidc-extra-scope=profile
>       - --oidc-extra-scope=openid
>       - -v10
>       command: kubectl
>       env: null
>       provideClusterInfo: false
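To verify that the token kubelogin caches really carries the expected email claim, the JWT payload can be decoded by hand; a rough sketch, reusing the exec args from above:
> # Fetch the ExecCredential JSON from kubelogin and pull out the raw ID token.
> TOKEN=$(kubectl oidc-login get-token \
>   --oidc-issuer-url=https://accounts.google.com \
>   --oidc-client-id=somestring.apps.googleusercontent.com \
>   --oidc-client-secret=superseret \
>   --oidc-extra-scope=email | jq -r '.status.token')
> # Decode the payload (second JWT segment): map base64url to base64 and pad.
> PAYLOAD=$(printf '%s' "$TOKEN" | cut -d. -f2 | tr '_-' '/+')
> while [ $(( ${#PAYLOAD} % 4 )) -ne 0 ]; do PAYLOAD="${PAYLOAD}="; done
> printf '%s' "$PAYLOAD" | base64 -d | jq .
> # The "email" claim must match the RBAC subject minus the "oidc:" prefix,
> # and "iss"/"aud" must match the cluster's issuer_url and client_id.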
Calling the apiserver with the oidc user then fails:
> kubectl get pods --user=oidc
> .......
> I0425 17:35:54.204892 62059 get_token.go:107] you already have a valid token until 2023-04-25 18:35:53 +0200 CEST
> I0425 17:35:54.204895 62059 get_token.go:114] writing the token to client-go
> error: You must be logged in to the server (Unauthorized)
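One mismatch I can think of is the issuer: the issuer_url configured on the cluster has to match the issuer field of Google's discovery document exactly, character for character:
> # Google's OIDC discovery document; "issuer" must equal the cluster's
> # issuer_url and kubelogin's --oidc-issuer-url (no trailing slash).
> curl -s https://accounts.google.com/.well-known/openid-configuration | jq .issuer
> # "https://accounts.google.com"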
I also tried to get some insight via the apiserver audit logs in the OVH web interface, but that is nearly impossible without filters. It would be nice if the audit logs could be downloaded, so I could grep through them.
Or maybe there is already a solution that I don't know about yet.
Perhaps someone has experience with OIDC and Google identities and can give me a hint.
A similar setup in a k3s cluster worked for me, but there I had direct access to the apiserver for debugging.
Sure, I could put Keycloak, Dex, or Authentik in between, but that is not what I initially wanted.
Best Regards