Hello,
I'm trying to deploy JupyterHub on an OVH managed Kubernetes cluster, but I can't get it to run with the default configuration.
`config.yaml`:
```yaml
# This file can update the JupyterHub Helm chart's default configuration values.
#
# For reference see the configuration reference and default values, but make
# sure to refer to the Helm chart version of interest to you!
#
# Introduction to YAML: https://www.youtube.com/watch?v=cdLNKUoMc6c
# Chart config reference: https://zero-to-jupyterhub.readthedocs.io/en/stable/resources/reference.html
# Chart default values: https://github.com/jupyterhub/zero-to-jupyterhub-k8s/blob/0883f9927b35ea2efad490a8e19dda8a4a8e12dc/jupyterhub/values.yaml
# Available chart versions: https://jupyterhub.github.io/helm-chart/
#
```
```shell
$ kubectl get node
NAME STATUS ROLES AGE VERSION
nodepool-813c1e34-xxxx-xxxx-98-node-121a16 Ready 4h v1.22.2
```
```shell
$ helm version
version.BuildInfo{Version:"v3.6.3", GitCommit:"d506314abfb5d21419df8c7e7e68012379db2354", GitTreeState:"clean", GoVersion:"go1.16.5"}
```
```bash
helm upgrade --cleanup-on-fail \
  --install skl-jhub jupyterhub/jupyterhub \
  --namespace skl-jhub \
  --create-namespace \
  --version=1.2.0 \
  --values config.yaml
```
Log output:
```shell
$ kubectl logs --namespace skl-jhub hub-87f985879-btlkc
Loading /usr/local/etc/jupyterhub/secret/values.yaml
No config at /usr/local/etc/jupyterhub/existing-secret/values.yaml
[I 2021-12-14 12:23:46.895 JupyterHub app:2479] Running JupyterHub version 1.5.0
[I 2021-12-14 12:23:46.895 JupyterHub app:2509] Using Authenticator: jupyterhub.auth.DummyAuthenticator-1.5.0
[I 2021-12-14 12:23:46.895 JupyterHub app:2509] Using Spawner: kubespawner.spawner.KubeSpawner-1.1.0
[I 2021-12-14 12:23:46.896 JupyterHub app:2509] Using Proxy: jupyterhub.proxy.ConfigurableHTTPProxy-1.5.0
[E 2021-12-14 12:23:46.907 JupyterHub app:2989]
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/jupyterhub/app.py", line 2986, in launch_instance_async
    await self.initialize(argv)
  File "/usr/local/lib/python3.8/dist-packages/jupyterhub/app.py", line 2521, in initialize
    self.init_db()
  File "/usr/local/lib/python3.8/dist-packages/jupyterhub/app.py", line 1723, in init_db
    dbutil.upgrade_if_needed(self.db_url, log=self.log)
  File "/usr/local/lib/python3.8/dist-packages/jupyterhub/dbutil.py", line 112, in upgrade_if_needed
    orm.check_db_revision(engine)
  File "/usr/local/lib/python3.8/dist-packages/jupyterhub/orm.py", line 771, in check_db_revision
    current_table_names = set(inspect(engine).get_table_names())
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/inspection.py", line 64, in inspect
    ret = reg(subject)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/reflection.py", line 182, in _engine_insp
    return Inspector._construct(Inspector._init_engine, bind)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/reflection.py", line 117, in _construct
    init(self, bind)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/reflection.py", line 128, in _init_engine
    engine.connect().close()
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/base.py", line 3165, in connect
    return self._connection_cls(self, close_with_result=close_with_result)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/base.py", line 96, in __init__
    else engine.raw_connection()
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/base.py", line 3244, in raw_connection
    return self._wrap_pool_connect(self.pool.connect, _connection)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/base.py", line 3214, in _wrap_pool_connect
    Connection._handle_dbapi_exception_noconnection(
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/base.py", line 2068, in _handle_dbapi_exception_noconnection
    util.raise_(
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/base.py", line 3211, in _wrap_pool_connect
    return fn()
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 307, in connect
    return _ConnectionFairy._checkout(self)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 767, in _checkout
    fairy = _ConnectionRecord.checkout(pool)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 425, in checkout
    rec = pool._do_get()
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/impl.py", line 256, in _do_get
    return self._create_connection()
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 253, in _create_connection
    return _ConnectionRecord(self)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 368, in __init__
    self.__connect()
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 611, in __connect
    pool.logger.debug("Error on connect(): %s", e)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/pool/base.py", line 605, in __connect
    connection = pool._invoke_creator(self)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/create.py", line 578, in connect
    return dialect.connect(*cargs, **cparams)
  File "/usr/local/lib/python3.8/dist-packages/sqlalchemy/engine/default.py", line 584, in connect
    return self.dbapi.connect(*cargs, **cparams)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
(Background on this error at: http://sqlalche.me/e/14/e3q8)
```
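The traceback boils down to sqlite3 not being able to open the JupyterHub database file, which lives on the `hub-db-dir` PVC mounted into the hub pod (at `/srv/jupyterhub` in the default chart layout, as far as I understand). To see what the hub process is actually running into, I figure I can mount the same claim from a throwaway pod and check the ownership of the mount point. A sketch of what I have in mind (pod name and image are my own choices; since the claim is RWO the pod has to land on the same node as the hub, which is trivially true here with a single node):
```yaml
# Throwaway pod: mounts the existing hub-db-dir claim, prints the ownership and
# permissions of the mount point, then sleeps so it can be inspected or deleted.
apiVersion: v1
kind: Pod
metadata:
  name: hub-db-dir-check
  namespace: skl-jhub
spec:
  restartPolicy: Never
  containers:
    - name: check
      image: busybox:1.35
      command: ["sh", "-c", "ls -ldn /srv/jupyterhub && sleep 3600"]
      volumeMounts:
        - name: hub-db
          mountPath: /srv/jupyterhub
  volumes:
    - name: hub-db
      persistentVolumeClaim:
        claimName: hub-db-dir
```
If the mount point shows up owned by `0:0` with mode `755`, a hub process running as uid `1000` cannot create `jupyterhub.sqlite` there, which would match the error above.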
**Edit:** added the PV and PVC descriptions below.
```shell
$ kubectl describe pv
Name: ovh-managed-kubernetes-8tk3qj-pvc-bacb4137-67a9-437f-a047-f561751920c9
Labels:
Annotations: pv.kubernetes.io/provisioned-by: cinder.csi.openstack.org
Finalizers: [kubernetes.io/pv-protection external-attacher/cinder-csi-openstack-org]
StorageClass: csi-cinder-high-speed
Status: Bound
Claim: skl-jhub/hub-db-dir
Reclaim Policy: Delete
Access Modes: RWO
VolumeMode: Filesystem
Capacity: 1Gi
Node Affinity:
Message:
Source:
Type: CSI (a Container Storage Interface (CSI) volume source)
Driver: cinder.csi.openstack.org
FSType:
VolumeHandle: a31dd08e-80a8-4ca8-91ab-7be7295b274e
ReadOnly: false
VolumeAttributes: storage.kubernetes.io/csiProvisionerIdentity=1639470543810-8081-cinder.csi.openstack.org
Events:
```
```shell
$ kubectl describe pvc
Name: hub-db-dir
Namespace: skl-jhub
StorageClass: csi-cinder-high-speed
Status: Bound
Volume: ovh-managed-kubernetes-8tk3qj-pvc-bacb4137-67a9-437f-a047-f561751920c9
Labels: app=jupyterhub
app.kubernetes.io/managed-by=Helm
chart=jupyterhub-1.2.0
component=hub
heritage=Helm
release=skl-jhub
Annotations: meta.helm.sh/release-name: skl-jhub
meta.helm.sh/release-namespace: skl-jhub
pv.kubernetes.io/bind-completed: yes
pv.kubernetes.io/bound-by-controller: yes
volume.beta.kubernetes.io/storage-provisioner: cinder.csi.openstack.org
Finalizers: [kubernetes.io/pvc-protection]
Capacity: 1Gi
Access Modes: RWO
VolumeMode: Filesystem
Used By: hub-55c56748d6-dfljt
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal ExternalProvisioning 26m persistentvolume-controller waiting for a volume to be created, either by external provisioner "cinder.csi.openstack.org" or manually created by system administrator
Normal Provisioning 26m cinder.csi.openstack.org_csi-cinder-controllerplugin-0_f7428e10-cc91-4341-92c5-eaae61679ebc External provisioner is provisioning volume for claim "skl-jhub/hub-db-dir"
Normal ProvisioningSucceeded 26m cinder.csi.openstack.org_csi-cinder-controllerplugin-0_f7428e10-cc91-4341-92c5-eaae61679ebc Successfully provisioned volume ovh-managed-kubernetes-8tk3qj-pvc-bacb4137-67a9-437f-a047-f561751920c9
```
This is definitely a permissions issue, since everything works when I run the pod as root, but I can't do that in production. The default user/group ID for JupyterHub is `1000`, so the mounted PV should have its ownership and permissions set accordingly.
Any idea how to achieve that?
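From what I understand, the Kubernetes-level mechanism for this is the pod security context's `fsGroup`: when it is set, the volume is made group-owned by that GID (and group-writable) at mount time, so a process running as `1000:1000` can write to it. Below is a minimal standalone sketch of that mechanism only, not the chart's own manifests; the pod name and image are illustrative, and reusing the `hub-db-dir` claim is just for the example:
```yaml
# Standalone sketch of the fsGroup mechanism (not the chart's manifests).
# With fsGroup set, the mounted volume is made group-owned by GID 1000 and
# group-writable, so a process running as uid/gid 1000 can create files on it,
# provided the CSI driver honors fsGroup.
apiVersion: v1
kind: Pod
metadata:
  name: fsgroup-demo
  namespace: skl-jhub
spec:
  securityContext:
    runAsUser: 1000
    runAsGroup: 1000
    fsGroup: 1000
  containers:
    - name: demo
      image: busybox:1.35
      command: ["sh", "-c", "touch /data/write-test && ls -ln /data && sleep 3600"]
      volumeMounts:
        - name: data
          mountPath: /data
  volumes:
    - name: data
      persistentVolumeClaim:
        claimName: hub-db-dir   # illustrative; the chart manages the real hub pod
```
Two things I'm unsure about: which value in the 1.2.0 chart sets this on the hub pod (the config reference linked in `config.yaml` should have it), and whether the `cinder.csi.openstack.org` driver actually honors `fsGroup`, since that depends on the `fsGroupPolicy` declared on its `CSIDriver` object. If it doesn't, I suppose the fallback is an init container that runs `chown -R 1000:1000` on the volume as root before the hub starts, so the main container can stay non-root.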