Elasticsearch Setup Custom Index and Write Issue
Elasticsearch receives logs via the Filebeat shipper with default settings. All custom index settings were configured in the /etc/filebeats/filebeats.yml file. This is my configuration file:
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["host-ip:9200"]
  protocol: "https"
  index: "samba-%{[agent.hostname]}-%{[agent.version]}-%{+dd.MM.yyyy}"
  # Authentication credentials - either API key or username/password.
  username: "elastic"
  password: "password"
  ssl:
    enabled: true
    certificate_authorities:
      - |
        -----BEGIN CERTIFICATE-----
        XXX
        -----END CERTIFICATE-----

setup.template:
  name: "samba"
  pattern: "samba-%{[agent.version]}"
  overwrite: true

setup.ilm.enabled: false
When the filebeat setup command was run, a "no matching index template found for data stream [samba]" error was thrown, even though this custom index template had already been created on the ELK stack. After starting the filebeat service, all logs were collected into the default index (.ds-filebeat-8.6.2-2023.03.09-000001).
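To see what Elasticsearch actually has installed, the index-template API can be queried directly, and Filebeat's template setup can be re-run with debug logging. The host and credentials below are the placeholders from the configuration above:

```shell
# Inspect the installed index template (placeholders from filebeat.yml above).
curl -k -u elastic:password "https://host-ip:9200/_index_template/samba?pretty"

# Re-run only Filebeat's template/data-stream setup, logging to stderr.
filebeat setup --index-management -e
```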
UPDATE:
Briefly, this is the API call output:
{
  "index_templates": [
    {
      "name": "samba",
      "index_template": {
        "index_patterns": [
          "samba-8.6.2"
        ],
        "template": {
          "settings": {
            "index": {
              "mapping": {
                "total_fields": {
                  "limit": "10000"
                }
              },
              "refresh_interval": "5s",
              "number_of_shards": "1",
              "max_docvalue_fields_search": "200",
              "query": {
                "default_field": [
                  // other fields.
                  "fields.*"
                ]
              }
            }
          },
          "mappings": {
            "_meta": {
              "beat": "filebeat",
              "version": "8.6.2"
            }
            // about 30,000 lines removed here (trimmed in VS Code).
          }
        },
        "composed_of": [],
        "priority": 150,
        "data_stream": {
          "hidden": false,
          "allow_custom_routing": false
        }
      }
    }
  ]
}
Answer 1 (score: 1)
TL;DR

The error says "no matching index template found for data stream [samba]", and indeed the pattern you have is samba-%{[agent.version]}, which expands to samba-8.6.2 and cannot match index names like samba-<hostname>-8.6.2-<date>.

Solution

Change the pattern to samba*. Your file should then look like:
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["host-ip:9200"]
  protocol: "https"
  index: "samba-%{[agent.hostname]}-%{[agent.version]}-%{+dd.MM.yyyy}"
  # Authentication credentials - either API key or username/password.
  username: "elastic"
  password: "password"
  ssl:
    enabled: true
    certificate_authorities:
      - |
        -----BEGIN CERTIFICATE-----
        XXX
        -----END CERTIFICATE-----

setup.template:
  name: "samba"
  pattern: "samba*"
  overwrite: true

setup.ilm.enabled: false
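As a quick illustration of why the original pattern fails, Python's fnmatch can stand in for Elasticsearch's wildcard matching; the hostname and date in the index name below are made up:

```python
from fnmatch import fnmatch

# Hypothetical index name produced by the output.elasticsearch.index setting
# "samba-%{[agent.hostname]}-%{[agent.version]}-%{+dd.MM.yyyy}".
index_name = "samba-myhost-8.6.2-09.03.2023"

# The old pattern expands to "samba-8.6.2": no wildcard, exact match only.
print(fnmatch(index_name, "samba-8.6.2"))  # False: template never applies
# The fixed pattern "samba*" covers every samba-* index / data stream name.
print(fnmatch(index_name, "samba*"))       # True
```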