Building an ELK Log Collection System on KubeSphere
Elasticsearch
Version: elasticsearch:7.14.0
1) Create the configuration file elasticsearch.yml to be mounted
Single node:
cluster.name: "cluster-es"
node.name: "elasticsearch-0"
network.host: "0.0.0.0"
node.master: true
node.data: true
http.port: 9200
transport.tcp.port: 9300
http.cors.allow-origin: "*"
http.cors.enabled: true
http.max_content_length: 200mb
discovery.seed_hosts: ["127.0.0.1"]
cluster.initial_master_nodes: ["elasticsearch-0"]
Cluster (create one copy per node; only node.name needs to change):
cluster.name: "cluster-es"
node.name: "elasticsearch-0"
network.host: "0.0.0.0"
node.master: true
node.data: true
http.port: 9200
transport.tcp.port: 9300
http.cors.allow-origin: "*"
http.cors.enabled: true
http.max_content_length: 200mb
discovery.seed_hosts:
- 192.168.1.100:9300
- 192.168.1.101:9300
- 192.168.1.102:9300
cluster.initial_master_nodes:
- "elasticsearch-0"
- "elasticsearch-1"
- "elasticsearch-2"
2) Expose the ports (9200 and 9300) and set the environment variable (see the manifest sketch after step 3):
ES_JAVA_OPTS: -Xms512m -Xmx512m
3) Mount the configuration file into the container
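When you configure the workload through the KubeSphere console, steps 2 and 3 amount to a StatefulSet roughly like the sketch below. It assumes the es-config ConfigMap from step 1 and a headless Service named elasticsearch-iwy9 (the name that appears in the Kibana ELASTICSEARCH_HOSTS later); adjust all names to your environment:

apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: elasticsearch
  namespace: mp-java
spec:
  serviceName: elasticsearch-iwy9      # headless Service governing the pods
  replicas: 1
  selector:
    matchLabels:
      app: elasticsearch
  template:
    metadata:
      labels:
        app: elasticsearch
    spec:
      containers:
        - name: elasticsearch
          image: elasticsearch:7.14.0
          ports:
            - containerPort: 9200      # HTTP
            - containerPort: 9300      # transport
          env:
            - name: ES_JAVA_OPTS
              value: "-Xms512m -Xmx512m"
          volumeMounts:
            - name: es-config
              mountPath: /usr/share/elasticsearch/config/elasticsearch.yml
              subPath: elasticsearch.yml
      volumes:
        - name: es-config
          configMap:
            name: es-config            # ConfigMap created in step 1

In practice you would also add a volumeClaimTemplate for /usr/share/elasticsearch/data so the index data survives pod restarts; it is omitted here to keep the sketch short.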
Kibana
Version: kibana:7.14.0
1) Expose port 5601 and add the environment variables:
ELASTICSEARCH_HOSTS: http://elasticsearch-0.elasticsearch-iwy9.mp-java.svc.cluster.local:9200
I18N_LOCALE: zh-CN
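For reference, the Deployment that KubeSphere creates from these settings looks roughly like the following (workload and label names are examples):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: kibana
  namespace: mp-java
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kibana
  template:
    metadata:
      labels:
        app: kibana
    spec:
      containers:
        - name: kibana
          image: kibana:7.14.0
          ports:
            - containerPort: 5601
          env:
            - name: ELASTICSEARCH_HOSTS
              value: "http://elasticsearch-0.elasticsearch-iwy9.mp-java.svc.cluster.local:9200"
            - name: I18N_LOCALE
              value: "zh-CN"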
Logstash
Version: logstash:7.14.0
1) Create the configuration files
logstash.conf:
input {
  tcp {
    port => 5044
    codec => json_lines
  }
}
filter {
  date {
    match => [ "@timestamp", "yyyy-MM-dd HH:mm:ss Z" ]
  }
  mutate {
    remove_field => ["@version", "agent", "cloud", "host", "input", "log", "tags", "_index", "_source", "ecs", "event"]
    add_field => { "app_name" => "%{[app_name]}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://10.233.42.25:9200"]
    index => "server-%{+YYYY.MM.dd}"
  }
}
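The tcp input with the json_lines codec expects each log event as one JSON object per line. An illustrative event (field names and values are made up, but app_name must be present for the mutate filter above to copy it) could look like:

{"@timestamp": "2024-01-01 12:00:00 +0800", "app_name": "demo-service", "level": "INFO", "message": "user login succeeded"}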
logstash.yml:
http.host: "0.0.0.0"
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.hosts: [ "http://10.233.42.25:9200" ]
pipelines.yml:
- pipeline.id: main
  path.config: "/usr/share/logstash/pipeline/logstash.conf"
2) Expose the ports (5044 for the TCP input)
3) Mount the configuration files; a combined Deployment and Service sketch follows
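Putting steps 2 and 3 together, the workload and service correspond roughly to the manifests below, assuming the three files above are stored in a ConfigMap named logstash-config (all names are examples):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: logstash
  namespace: mp-java
spec:
  replicas: 1
  selector:
    matchLabels:
      app: logstash
  template:
    metadata:
      labels:
        app: logstash
    spec:
      containers:
        - name: logstash
          image: logstash:7.14.0
          ports:
            - containerPort: 5044      # TCP input defined in logstash.conf
          volumeMounts:
            - name: logstash-config
              mountPath: /usr/share/logstash/pipeline/logstash.conf
              subPath: logstash.conf
            - name: logstash-config
              mountPath: /usr/share/logstash/config/logstash.yml
              subPath: logstash.yml
            - name: logstash-config
              mountPath: /usr/share/logstash/config/pipelines.yml
              subPath: pipelines.yml
      volumes:
        - name: logstash-config
          configMap:
            name: logstash-config      # ConfigMap holding logstash.conf, logstash.yml, pipelines.yml
---
apiVersion: v1
kind: Service
metadata:
  name: logstash
  namespace: mp-java
spec:
  selector:
    app: logstash
  ports:
    - name: tcp-input
      port: 5044
      targetPort: 5044

With a Service like this, applications inside the cluster can ship their JSON log lines to logstash.mp-java.svc.cluster.local:5044.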