Questions tagged [filebeat]

0 votes · 0 answers · 4 views

Configure filebeat to control how often logs are read and to ignore old logs

I have prospectors set up in my filebeat.yml roughly as follows: filebeat.prospectors: - type: log paths: - /tmp/log/typeA*.log pipeline: 'pipelineA' fields_under_root: true fields: logtype: TYPEA - type: log paths: - /tmp/log/typeB*.log pipeline: 'pipelineB' fields_under_root: true fields: logtype:...
Timothy Clotworthy
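For questions like the one above, filebeat's per-prospector options `scan_frequency`, `ignore_older`, and `clean_inactive` control how often paths are rescanned and which old files are skipped. A minimal sketch under the question's own paths (the durations are illustrative, not recommendations):

```yaml
filebeat.prospectors:
  - type: log
    paths:
      - /tmp/log/typeA*.log
    pipeline: "pipelineA"
    fields_under_root: true
    fields:
      logtype: TYPEA
    scan_frequency: 30s   # how often filebeat checks the paths for new files
    ignore_older: 24h     # skip files not modified within the last 24 hours
    clean_inactive: 48h   # drop registry state for files inactive longer than this
```

Note that filebeat requires `clean_inactive` to be greater than `ignore_older` plus `scan_frequency`, otherwise it refuses to start.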
1 vote · 0 answers · 148 views

Do I use Filebeat, Ingest or Pipelines to get rid of Logstash in my ELK stack?

I'm fairly new to filebeat, ingest and pipelines in Elasticsearch, and not sure how they relate. In my old environments we had ELK with some custom grok patterns in a directory on the logstash shipper to parse Java stack traces properly. The logstash indexer would later put the logs in ES. How do I do th...
Dennis
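One common pattern for dropping Logstash, sketched here as an assumption rather than the asker's setup: move the grok parsing into an Elasticsearch ingest pipeline (created via the `_ingest` API) and point filebeat straight at Elasticsearch, naming that pipeline in the output. The pipeline id and paths are placeholders; multiline joining of stack traces still happens on the filebeat side:

```yaml
output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  pipeline: "java-stacktraces"   # an ingest pipeline defined beforehand in Elasticsearch

filebeat.prospectors:
  - type: log
    paths:
      - /var/log/app/*.log
    multiline.pattern: '^[[:space:]]'   # continuation lines start with whitespace
    multiline.negate: false
    multiline.match: after              # append them to the preceding event
```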
1 vote · 1 answer · 479 views

ObjectMapper - add new value to yaml file

Below is my yaml file. The requirement is to add a new line '2.log' under 'paths'. I am currently reading the yaml file as a Map. My code: ObjectMapper mapper = new ObjectMapper(new YAMLFactory()); Map obj = mapper.readValue(new File("filebeat.yml"), Map.class); obj.get("filebeat.prospectors").get(0).get("paths"); /...
Forece85
1 vote · 0 answers · 81 views

Issue with filebeat kafka output lz4 compression

I am trying to set up filebeat (6.2) with the kafka output using lz4 compression, but I always get the error ERR Invalid kafka configuration: kafka: invalid configuration (lz4 compression requires Version >= V0_10_0_0). My output configuration is: output.kafka: hosts: ['kafka1:10092','kafka2:10092','kafk...
Prabin Meitei
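The error message itself points at the fix: the kafka output assumes an old broker protocol version unless the `version` setting says otherwise, and lz4 needs at least 0.10.0.0. A hedged sketch (topic name and hosts are illustrative, mirroring the question):

```yaml
output.kafka:
  hosts: ["kafka1:10092", "kafka2:10092"]
  topic: "logs"           # illustrative topic name
  version: "0.11.0.0"     # advertise a broker version >= 0.10.0.0 so lz4 is accepted
  compression: lz4
```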
1 vote · 0 answers · 247 views

Extract Filebeat prospector path regexp match to field

I wonder if there's any way to extract a regexp match on a Filebeat prospector path into a field, for example something like: filebeat.prospectors: - type: log enabled: true paths: - /var/logs/apps/[(a-z)]/*.log json.keys_under_root: true json.add_error_key: true json.message_key: log fields: log_topic: 'app-$...
Idan Gozlan
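Filebeat cannot interpolate a regex capture from `paths` into a field, but newer beats versions ship a `dissect` processor that can split a path segment out of the `source` field (where filebeat records the file path). A sketch, assuming logs sit one directory level under /var/logs/apps and that `app_name` is a made-up field name:

```yaml
processors:
  - dissect:
      tokenizer: "/var/logs/apps/%{app_name}/%{file_name}"
      field: "source"       # the log file's path as recorded by filebeat
      target_prefix: ""     # place app_name and file_name at the event root
```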
1 vote · 0 answers · 125 views

Filebeat message ordering during aggressive log rotation

We log a ton of data in the following format: {'record_id': '123abc', 'version': 1, 'status': 'old_status', ...} {'record_id': '123abc', 'version': 2, 'status': 'new_status', ...} We then use Filebeat to ship these logs off the server into a Kafka cluster. With smaller volumes of log data, this work...
foxygen
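Since Kafka only guarantees ordering within a single partition, one mitigation for the reordering described above is to key the partitioner on the record id, so every version of a record lands on the same partition. A sketch using the field name from the question (hosts and topic are placeholders):

```yaml
output.kafka:
  hosts: ["kafka1:9092"]
  topic: "records"            # illustrative
  partition.hash:
    hash: ["record_id"]       # events with the same record_id hash to one partition
    random: false             # do not fall back to a random partition if the field is missing
```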
1 vote · 0 answers · 31 views

filebeat can't read a log written by PHP “fwrite”

My filebeat can't pick up the log file written by PHP's 'fwrite'. Here is my PHP code: $date = date('Y-M-d'); $log_path = JSON_LOG_PATH.'_'.$date.'_'.$eventid.'.log'; if (!is_file($log_path)) { # code... if( ($jsonLog=fopen ($log_path,'w')) === FALSE){ return false; } $json_string = json_encode($...
vincent wang
1 vote · 1 answer · 509 views

Filebeat not pushing logs to Elasticsearch

I am new to docker and all this logging stuff, so maybe I'm making a stupid mistake; thanks for helping in advance. I have ELK running in a docker container (6.2.2) via the Dockerfile line: FROM sebp/elk:latest. In a separate container I am installing and running Filebeat via the following Dockerfile line...
Flash Death
1 vote · 1 answer · 263 views

Filebeat for gz files

Does filebeat support ingesting from gz files? If I unzip the files before ingesting, they become very large. I searched the Elasticsearch forum but could not find anything useful.
Ijaz Ahmad Khan
1 vote · 1 answer · 736 views

Filebeat - Failed to publish events caused by: read tcp x.x.x.x:36196->x.x.x.x:5045: i/o timeout

Hi, I'm running into a problem while sending logs via filebeat to logstash. In short: I can't see logs in Kibana, and when tailing the filebeat log I see a lot of these: ERROR logstash/async.go:235 Failed to publish events caused by: read tcp x.x.x.x:36246->y.y.y.y:5045: i/o timeout (while y.y.y.y is...
Tomer Lev
1 vote · 0 answers · 88 views

using filebeat with the logs transferred via rsync

If the logs are transferred to one central server using rsync, and filebeat is used to ship these logs to elasticsearch, does filebeat work as expected, or does it attempt to send a log line even though the complete line may not have arrived yet? I am not sure how rsync works internally. Y...
hello world
1 vote · 1 answer · 1.4k views

Docker-compose filebeat connection issue to logstash

I am running logstash and filebeat from separate docker-compose.yml files, but filebeat cannot connect to logstash. I can telnet into logstash properly (telnet 127.0.0.1 5044) once I wait for the logstash pipelines to start, yet filebeat cannot create a connection. I get this error: ERROR pipeline/output.go...
Joey Lee
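Two separate docker-compose files do not share a default network, and `127.0.0.1` inside the filebeat container refers to the filebeat container itself, not the host. One sketch, under the assumption of a pre-created external network named `elk` (all names and versions here are illustrative), joins both stacks to it so filebeat can reach logstash by service name:

```yaml
# fragment of the logstash compose file
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:6.2.2
    networks: [elk]
networks:
  elk:
    external: true    # created once with: docker network create elk
---
# fragment of the filebeat compose file
services:
  filebeat:
    image: docker.elastic.co/beats/filebeat:6.2.2
    networks: [elk]
networks:
  elk:
    external: true
```

With this layout, filebeat's output would point at `logstash:5044` rather than `127.0.0.1:5044`.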
1 vote · 1 answer · 262 views

Transfer logs from kafka to elasticsearch

I am looking for a lightweight log shipper that can transfer my logs from kafka directly to elasticsearch. Of Filebeat, Logagent and Logstash (though I need something lightweight), which of them, or any other, suits my use case best?
vaibhav goel
1 vote · 0 answers · 331 views

Filebeat Invalid Frame Type Error

I have an ELK stack running on AWS and I wanted to get logs from my Windows PC as a filebeat stream, so I installed filebeat on that Windows host. When I run my logstash configuration I get the following error: [2018-05-24T06:34:18,140][INFO ][org.logstash.beats.BeatsHandler] Exception: org.logstash.be...
1 vote · 1 answer · 294 views

Plugin not Working with FileBeat in ElasticSearch

2018-05-31T16:51:02.494+0530 ERROR fileset/factory.go:93 Error loading pipeline: Error loading pipeline for fileset apache2/access: This module requires the following Elasticsearch plugins: ingest-user-agent, ingest-geoip. You can install them by running the following commands on all t...
Harshit Yadav
1 vote · 0 answers · 208 views

Getting Docker logs in filebeat

I am trying to get docker logs into filebeat: filebeat.prospectors: - type: log paths: - /var/lib/mesos/slave/slaves/*/frameworks/*/executors/*/*/*/std* - /var/lib/mesos/slave/slaves/*/frameworks/*/executors/*/*/*/*.log json.message_key: log json.keys_under_root: true processors: - add_docker_metadat...
Madhu
1 vote · 1 answer · 235 views

ELK in Docker and Beats in a machine in a different network

I was trying to configure ELK in docker containers on my private network, with a Linux box running beats on another private network. I was trying to generate an SSL cert for Filebeat to verify the identity of the ELK server. I tried using the public IP of the container host by forwarding the container port, but...
krishna chandu
1 vote · 1 answer · 121 views

Filebeat holds the log file even after it has finished reading

I have a scenario where I have to move the log files to another folder using the tool robocopy. While doing this, it throws an error saying Permission denied. Using Resource Monitor I have verified that Filebeat holds the log file even after it has reached EOF. For this reason, I'm unable to move th...
Manasa
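Filebeat keeps each harvester's file handle open until one of its `close_*` conditions fires, which on Windows blocks moving or deleting the file. A sketch of the relevant prospector options (path and durations are illustrative; closing handles early risks missing lines appended after the close):

```yaml
filebeat.prospectors:
  - type: log
    paths:
      - 'C:\logs\*.log'
    close_eof: true       # release the handle as soon as EOF is reached
    close_timeout: 5m     # hard cap on how long a harvester keeps a file open
    close_removed: true   # close when the file is deleted or moved (default)
```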
1 vote · 0 answers · 91 views

Winlogbeat - not streaming logs

I am trying to stream my Active Directory logs ('Active Directory Web Services', 'Directory Services') with Winlogbeat, but it is not working. This is my winlogbeat.yml file: winlogbeat.event_logs: - name: Application ignore_older: 6h - name: Security ignore_older: 6h - name: System ignore_older: 6h...
yserk
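The winlogbeat.yml shown only subscribes to the Application, Security and System channels; the Active Directory channels have to be listed explicitly, and the `name` values must match the channel names exactly as Event Viewer shows them. A sketch extending the question's own config (the AD channel names below are assumptions to verify against Event Viewer):

```yaml
winlogbeat.event_logs:
  - name: Application
    ignore_older: 6h
  - name: Security
    ignore_older: 6h
  - name: System
    ignore_older: 6h
  - name: "Directory Service"                 # exact channel name from Event Viewer
  - name: "Active Directory Web Services"     # exact channel name from Event Viewer
```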
1 vote · 0 answers · 162 views

filebeat/libbeat compiler on gccgo for solaris/sparc

I am trying to compile the filebeat project for Solaris/SPARC environments. Officially, Elastic tools are not supported on Solaris; nevertheless, there is a workaround for this: compiling via gccgo. I have prepared my gccgo compiler and I can compile a small project (with a single file...
1 vote · 0 answers · 91 views

filebeat.yml.rpmnew or filebeat.yml

I recently installed filebeat, and I would like to edit the yml file to specify this input: filebeat.prospectors: - input_type: log paths: - /path/*.xml # scan_frequency: 60s document_type: message multiline.pattern: '^
C.med
1 vote · 1 answer · 125 views

ELK stack custom fields

I am quite new to the ELK stack and am trying to add custom fields in filebeat. I have Siebel error codes present in the message field, and I am trying to create a custom field so the error code is displayed as a separate field in the Kibana dashboard. Below is the filebeat.yml config; the fields are not dis...
parinita vinod
1 vote · 1 answer · 167 views

Get fields of log in Grok Debugger Kibana

I have a log for one application on my apache server. In Kibana I have this information in the message field: INFO [20 jul 2018 09:25:21] PoolJDBC - [Pool Properties] NumConnections: 50 | NumBusyConnections: 2 | NumIdleConnections: 48 I need to extract these fields: Date / Time: 20 jul 2018 09:25:21 NumConn...
Eric
1 vote · 1 answer · 170 views

Graylog 2 Collector-Sidecar Configuration doesn't merge Multiline Messages correctly

I'm using Graylog 2 with the Collector Sidecar to collect logfiles from several remote machines. Those logfiles contain Java stack traces, and Graylog lists every line of these as a separate message. I tried using the 'Enable Multiline' option in the collector configuration and gave it a correct regul...
Goppi
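For Java stack traces the usual beats-side approach is a multiline pattern that glues continuation lines (indented `at ...` frames, `...`, `Caused by:`) onto the preceding message, so each trace becomes one event before it reaches Graylog. A sketch of the filebeat-style multiline settings the sidecar's collector would need:

```yaml
multiline:
  pattern: '^[[:space:]]+(at|\.{3})[[:space:]]|^Caused by:'  # stack-trace continuation lines
  negate: false
  match: after    # append matching lines to the previous, non-matching line
```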
1 vote · 0 answers · 356 views

Logstash - Filebeat doesn't parse every line of an IIS log file

I am trying to parse IIS logs from a log file on the server; the grok filters are not the problem because I tried without them. Logstash parses only half of the lines in the log file. For example: the IIS log file has events from 1:00 AM to 12:00 PM, but logstash only parses from 3:00 AM (for example)...
lina ab
1 vote · 0 answers · 115 views

Logstash nginx filter doesn't apply to half of rows

Using filebeat to push nginx logs to logstash and then to elasticsearch. Logstash filter: filter { if [fileset][module] == 'nginx' { if [fileset][name] == 'access' { grok { match => { 'message' => ['%{IPORHOST:[nginx][access][remote_ip]} - %{DATA:[nginx][access][user_name]} \[%{HTTPDATE:[nginx][acc...
Darker
1 vote · 0 answers · 272 views

Filebeat sends logs to Logstash through an nginx proxy

I am trying to make Filebeat send logs to Logstash using docker containers. The problem is that I have an nginx proxy in between, and the Filebeat-Logstash communication is not based on HTTPS. What is the solution to make it work? I was trying to make nginx able to process tcp streams by configuring it...
dventi3
1 vote · 1 answer · 141 views

Getting docker log output into files on mac and linux

I have a set of containers orchestrated via docker compose. Some of the containers write to a log file inside the container, and I use volumes to make those available on the host. However, some containers only log via stdout/stderr. The issue I'm having is I can't figure out how to get that i...
gunygoogoo
1 vote · 0 answers · 22 views

Read files owned by root from a container

I have a Docker container (call it A) which creates log files on the host. Since the user running in the container is root, the logs are also owned by root. Now I want a Filebeat container to read those logs. I can think of the following options: Rebuild container A and make sure its user is non-roo...
Johnny
1 vote · 1 answer · 67 views

How to create multiple indexes based on a condition in logstash

I am trying to create multiple indexes for elasticsearch in logstash, but my 'if condition' is not creating a single index; without the if condition it works fine. However, if I use a file input in logstash without filebeat, it works as I expect. Can anyone help me for r...
Vinit Jordan
1 vote · 0 answers · 117 views

Filebeat - how control level nested json object parsing - decode_json_fields

How can I control the depth of decode_json_fields? max_depth does not seem to help in my case. Goal: parse '/var/lib/docker/containers//.log' while limiting the max json depth (so as not to generate hundreds of nested fields in the elasticsearch index). name: 'host-01' queue: mem: events: 16384 # batch of events to the o...
AZ-
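For reference, `decode_json_fields` takes `max_depth` together with `fields` and `process_array`, and `max_depth` only limits how many levels of *nested JSON strings* are decoded; it does not cap how many keys an already-parsed object contributes. A sketch of the processor's shape (field names are the common defaults, not taken from the question):

```yaml
processors:
  - decode_json_fields:
      fields: ["message"]   # which event fields hold JSON strings
      max_depth: 1          # decode only the outermost JSON document
      process_array: false
      target: ""            # merge decoded keys into the event root
      overwrite_keys: false
```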
1 vote · 0 answers · 15 views

Put docker container name in docker logs

My overall problem is that I am using filebeat, running on the host, to push logs to ES/Kibana, and the name of the container is not a field (I do have the container id). How can I get the container name added to the details pushed to filebeat? I attempted to add the container name to the logs by...
soandos
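When filebeat runs on the host with read access to the Docker socket, the `add_docker_metadata` processor can enrich events with fields such as `docker.container.name`, matching the container id it finds in the log file path. A sketch, assuming the default socket location:

```yaml
processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"   # filebeat needs read access to this socket
      match_source: true                    # derive the container id from the log file path
```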
1 vote · 0 answers · 19 views

How to add a tag when a message is multiline in Logstash

I use Filebeat 6.x to ship my logs to logstash. Some of my logs may be multiline; that's why I use Filebeat to manage multiline messages. Now I want to add a filter in logstash that adds a tag if the message is multiline. If those multilines were parsed by logstash I would use...
airdata
1 vote · 0 answers · 30 views

Filebeat doesn't send logs to logstash on ELK docker stack

I have the ELK stack installed in dockers (each component sits in a different container, over the same network, and they use the official elk images). This is how I configured the ELK: 1. sudo docker network create somenetwork 2. sudo docker pull elasticsearch:6.6.1 sudo docker run -dit --name elasticsearch -h ela...
pinokyo
1 vote · 0 answers · 23 views

ELK stack not parsing the logs

I have an ELK stack running in my environment (CentOS 7). The whole process seems to work correctly; however, the logs sent by filebeat are not being parsed by logstash. # logstash input input { beats { host => ['LogstashIP'] port => 5044 } } # httpd filter filter { grok { match => { 'message' =>...
euduzz
1 vote · 0 answers · 39 views

Why is Kibana unable to create index pattern? (ELBK)

I am trying to set up filebeat to ship logs from our spring applications to logstash. I found a good tutorial that has exactly what I need to test locally: http://www.andrew-programming.com/2018/09/18/integrate-springboot-application-with-elk-and-filebeat/ But when I go to create an index pattern I get...
Mick O'Gorman
1 vote · 0 answers · 22 views

DataType Conversion from string to date & from string to ip in convert processor

I want to convert one of the fields from string to a date format, and another from string to the ip datatype, using the pipeline file I created. I am trying this with the available convert processor, but it throws the errors 'ip is not supported' and that the date format is not supported. I checked the same from the...
Swati Sahu
1 vote · 0 answers · 20 views

Filebeat: read logs from a running docker image on mac OS

I have a running docker image that produces some logs, putting them in the default location /var/lib/docker/containers/CONTAINER_ID, and another docker image with Filebeat that should read from the first image. I set the Filebeat configuration in the following way: filebeat.inputs: - type: docker c...
Nicolò Pomini
0 votes · 0 answers · 5 views

Filebeat can't send logs after Elasticsearch cluster failure

We recently had a problem when the ES cluster failed. The problem was resolved, but filebeat failed to send new data after the failure. Here's a portion of the logs; it seems to retry but can't send the data: 2019-04-08T11:52:04.182+0300 INFO elasticsearch/client.go:690 Connected to Elasticse...
chester89
0 votes · 0 answers · 3 views

How does filebeat handle reliability when it outputs to Kafka?

It is known that Kafka can provide exactly-once delivery semantics since 0.11.0.0, and I expect to make filebeat output to Kafka that way. According to KIP-98, it requires some configs to be set properly, including enable.idempotence, transaction.timeout.ms and transactional.id for the producer (i.e., fi...
iamabug
