[Error] A case where kubelet kept erroring during a binary k8s deployment
While deploying k8s from binaries, kubelet kept failing, and journalctl only showed a goroutine dump like this:
Feb 24 14:15:55 k8s-master01 kubelet[4257]: goroutine 743 [chan receive]:
Feb 24 14:15:55 k8s-master01 kubelet[4257]: k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch.(*Broadcaster).loop(0xc000a2cfc0)
Feb 24 14:15:55 k8s-master01 kubelet[4257]: /workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch/mux.go:2
Feb 24 14:15:55 k8s-master01 kubelet[4257]: created by k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch.NewBroadcaster
Feb 24 14:15:55 k8s-master01 kubelet[4257]: /workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/watch/mux.go:7
Feb 24 14:15:55 k8s-master01 kubelet[4257]: goroutine 744 [chan receive]:
Feb 24 14:15:55 k8s-master01 kubelet[4257]: k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher.func1(0x4f3d7c0, 0xc000f7
Feb 24 14:15:55 k8s-master01 kubelet[4257]: /workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record/event.go
Feb 24 14:15:55 k8s-master01 kubelet[4257]: created by k8s.io/kubernetes/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher
I searched around online and none of the suggestions helped. Only then did it occur to me to check kubelet's own log file under
/opt/kubernetes/logs/kubelet.xxxxx, where I found this error:
failed to run Kubelet: misconfiguration: kubelet cgroup driver: "cgroupfs" is different from docker cgroup driver: "systemd"
So the root cause was on Docker's side: its cgroup driver did not match kubelet's.
[root@k8s-master01 logs]# docker info|grep Cgroup
Cgroup Driver: systemd
Cgroup Version: 1
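For comparison, it helps to confirm which driver kubelet itself was configured with. A quick way to check (the /opt/kubernetes/cfg/kubelet-config.yml path is just how my binary deployment is laid out, so adjust it to your own setup):
[root@k8s-master01 logs]# grep -i cgroup /opt/kubernetes/cfg/kubelet-config.yml
cgroupDriver: cgroupfs
In my case it showed cgroupfs, matching the "cgroupfs" side of the error above.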
The fix
[root@k8s-master01 logs]# cat /etc/docker/daemon.json
{
"exec-opts": ["native.cgroupdriver=cgroupfs"],
"registry-mirrors": ["http://hub-mirror.c.163.com"]
}
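The line that actually matters here is "exec-opts": ["native.cgroupdriver=cgroupfs"], which tells Docker to use the cgroupfs driver so that it matches kubelet; the registry-mirrors entry is just an image mirror setting and has nothing to do with this error. Then restart Docker and kubelet: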
[root@k8s-master01 logs]# systemctl restart docker
[root@k8s-master01 logs]# systemctl restart kubelet
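Before declaring victory, the same docker info check as before should now report cgroupfs on Docker's side:
[root@k8s-master01 logs]# docker info|grep Cgroup
Cgroup Driver: cgroupfs
Cgroup Version: 1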
After that, kubelet came up normally.
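Side note: the mismatch can also be fixed in the other direction, keeping Docker on systemd and switching kubelet to systemd, which is the driver the Kubernetes docs generally recommend on systemd-based hosts. A rough sketch of that alternative, again assuming a KubeletConfiguration YAML under my /opt/kubernetes/cfg/ layout:
[root@k8s-master01 logs]# grep cgroupDriver /opt/kubernetes/cfg/kubelet-config.yml
cgroupDriver: systemd
[root@k8s-master01 logs]# systemctl restart kubelet
Either direction works; what matters is that the two drivers agree.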