I use Logstash to process *.csv files and push them to a Kafka topic.
The following plugins are used:
- logstash-integration-kafka
- logstash-codec-csv
The input file has this format (2 fields with a separator, no header row):
1234556;fraud
2342342;collection
... 0-250k records ...
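
For context, a minimal sketch of the kind of pipeline I mean is below; the path, column names, broker address and topic name are placeholders, not my real config:

```
input {
  file {
    # Placeholder path; adjust to wherever the CSV files arrive.
    path => "/data/incoming/*.csv"
    mode => "read"
    sincedb_path => "/dev/null"
    # logstash-codec-csv parses each line into fields;
    # the column names here are made up, rename as needed.
    codec => csv {
      separator => ";"
      columns => ["account_id", "category"]
    }
  }
}

output {
  kafka {
    # Placeholder broker and topic.
    bootstrap_servers => "localhost:9092"
    topic_id => "records"
    codec => json
  }
}
```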
But now, apart from pushing the content of the file, I have to emit one extra event at the beginning of each file's processing and push it to Kafka as well (same or different topic, it doesn't matter).
All I have managed to come up with is a separate pipeline with a multiline codec that treats the whole CSV file as a single event.
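Roughly, that workaround would look something like the sketch below; the path, the never-matching pattern, the size limits, the field names and the marker topic are all placeholders I would still have to tune:

```
input {
  file {
    path => "/data/incoming/*.csv"       # same placeholder path as above
    mode => "read"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "__WILL_NEVER_MATCH__"  # no CSV line contains this string
      negate => true                     # so every line is merged ...
      what => "previous"                 # ... into the previous event
      max_lines => 300000                # defaults (500 lines / 10 MiB) are
      max_bytes => "50 MiB"              # too small for up to 250k records
      auto_flush_interval => 5           # flush the buffered event after 5 s of quiet
    }
  }
}

filter {
  # Keep a small marker payload instead of the whole file content.
  mutate {
    replace   => { "message" => "file-processing-started" }
    # Field name depends on ECS compatibility; legacy mode exposes it as %{path}.
    add_field => { "source_file" => "%{[log][file][path]}" }
  }
}

output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "file-markers"           # placeholder topic for the extra events
    codec => json
  }
}
```

This feels heavy-handed, since it reads every file a second time just to produce one marker event.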
Is there any good solution for this?