
Writing SpringCloud microservice logs to Kafka


SpringCloud microservice logs can be written to a file and shipped to Logstash via Filebeat, or sent to Logstash directly. Writing the logs to Kafka instead takes advantage of Kafka's high throughput and performance to reduce latency in the application; the logs can then be forwarded to Logstash asynchronously.

Start the Kafka service

Start ZooKeeper

./bin/zookeeper-server-start.sh ./config/zookeeper.properties

Start Kafka

./bin/kafka-server-start.sh ./config/server.properties

Create the topic

bin/kafka-topics.sh --create --topic logger-channel --bootstrap-server localhost:9092
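Optionally, confirm the topic exists (it is created here with the broker defaults for partitions and replication):

./bin/kafka-topics.sh --describe --topic logger-channel --bootstrap-server localhost:9092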

Start a console consumer

./bin/kafka-console-consumer.sh --topic logger-channel --bootstrap-server localhost:9092

Spring Boot service

Add the dependency

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.8.0</version>
</dependency>
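The logback configuration below also relies on the KafkaAppender from logback-kafka-appender and the LogstashLayout from logstash-logback-encoder. If these are not already on the classpath, dependencies along these lines are needed (the versions here are assumptions, not taken from the original project):

<!-- assumed versions; pick ones compatible with your logback / Spring Boot version -->
<dependency>
    <groupId>com.github.danielwegener</groupId>
    <artifactId>logback-kafka-appender</artifactId>
    <version>0.2.0-RC2</version>
</dependency>
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.6</version>
</dependency>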

Create logback-spring.xml

The configuration uses a LayoutKafkaMessageEncoder to encode messages, an AsyncAppender to write logs asynchronously, and topic to target the logger-channel topic.

<?xml version="1.0" encoding="UTF-8"?>
<!-- scan: when true, the configuration file is reloaded if it changes; defaults to true.
     scanPeriod: interval for checking whether the file has changed; only effective when scan
     is true. If no unit is given the value is in milliseconds; the default is 1 minute.
     debug: when true, logback prints its internal status messages; defaults to false. -->
<!-- <configuration scan="false" scanPeriod="60 seconds" debug="false"> -->
<configuration>

    <!-- Context name, used to distinguish logs from different applications. It can only be
         set once and can be printed with %contextName. -->
    <contextName>kafka-log-test</contextName>

    <property name="applicationName" value="${spring.application.name:-paw-kelk}"/>
    <!-- Root directory for log files -->
    <property name="logDir" value="/Users/rubble/logs/${applicationName}"/>
    <!-- Log file name -->
    <property name="logName" value="${applicationName}"/>
    <!-- Active profile -->
    <property name="profileActive" value="${spring.profiles.active:-dev}"/>

    <!-- ConsoleAppender: writes to the console -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <!-- Pattern: %d date/time, %thread thread name, %-5level level padded to 5 characters,
             %logger{36} logger name abbreviated to at most 36 characters, %msg message, %n newline -->
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %contextName [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Fallback file appender -->
    <appender name="logfile" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <!-- <Encoding>UTF-8</Encoding> -->
        <File>${logDir}/${logName}.log</File>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <FileNamePattern>${logDir}/history/${logName}.%d{yyyy-MM-dd}.zip</FileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %contextName [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <appender name="kafkaAppender" class="com.github.danielwegener.logback.kafka.KafkaAppender">
        <encoder class="com.github.danielwegener.logback.kafka.encoding.LayoutKafkaMessageEncoder">
            <layout class="net.logstash.logback.layout.LogstashLayout">
                <includeContext>false</includeContext>
                <includeCallerData>true</includeCallerData>
                <customFields>{"appName":"${applicationName}","env":"${profileActive}"}</customFields>
                <fieldNames class="net.logstash.logback.fieldnames.ShortenedFieldNames"/>
                <!-- <fieldNames class="net.logstash.logback.fieldnames.LogstashFieldNames"/> -->
            </layout>
            <charset>UTF-8</charset>
        </encoder>
        <!-- The topic must match the one created on the broker, otherwise the messages silently go nowhere -->
        <topic>logger-channel</topic>
        <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.HostNameKeyingStrategy"/>
        <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
        <producerConfig>bootstrap.servers=localhost:9092</producerConfig>
        <!-- If Kafka is unavailable, fall back to this appender -->
        <appender-ref ref="logfile"/>
    </appender>

    <!-- Write to Kafka asynchronously so the application threads are not blocked -->
    <appender name="ASYNC" class="ch.qos.logback.classic.AsyncAppender">
        <neverBlock>true</neverBlock>
        <includeCallerData>true</includeCallerData>
        <discardingThreshold>0</discardingThreshold>
        <queueSize>2048</queueSize>
        <appender-ref ref="kafkaAppender"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <!-- <appender-ref ref="logfile"/> -->
        <!-- Kafka logging -->
        <appender-ref ref="ASYNC"/>
    </root>
</configuration>
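One caveat: plain ${...} substitution in logback reads logback, system, and OS-environment properties rather than the Spring Environment, so applicationName and profileActive fall back to the paw-kelk and dev defaults unless those keys are passed as system properties. In a logback-spring.xml, <springProperty> declarations could read them from application.yml instead; a sketch of that alternative (not part of the original configuration):

<springProperty scope="context" name="applicationName" source="spring.application.name" defaultValue="paw-kelk"/>
<springProperty scope="context" name="profileActive" source="spring.profiles.active" defaultValue="dev"/>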


Test controller

@GetMapping("/kafka")
public String kafka() {
    int time = RandomUtil.randomInt(0, 100);
    log.info("cost time: " + time);
    log.debug("debug time: " + time);
    return "cost time: " + time;
}
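The snippet assumes a @RestController with a Lombok @Slf4j logger and Hutool's RandomUtil on the classpath (the class appears in the output below as com.paw.kafka.elk.controller.KafkaLogController). A request such as the following triggers both log calls:

curl http://localhost:8080/kafka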


The console consumer prints the two messages, one INFO and one DEBUG:

{"@timestamp":"2021-08-01T14:47:52.068+08:00","@version":1,"message":"cost time: 32","logger":"com.paw.kafka.elk.controller.KafkaLogController","thread":"http-nio-8080-exec-2","level":"INFO","levelVal":20000,"caller":{"class":"com.paw.kafka.elk.controller.KafkaLogController","method":"kafka","file":"KafkaLogController.java","line":35},"appName":"paw-kelk","env":"dev"}{"@timestamp":"2021-08-01T14:47:52.068+08:00","@version":1,"message":"debug time: 32","logger":"com.paw.kafka.elk.controller.KafkaLogController","thread":"http-nio-8080-exec-2","level":"DEBUG","levelVal":10000,"caller":{"class":"com.paw.kafka.elk.controller.KafkaLogController","method":"kafka","file":"KafkaLogController.java","line":36},"appName":"paw-kelk","env":"dev"}
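Note that with <root level="INFO"> the DEBUG entry above would normally be filtered out; for it to reach Kafka, the controller's logger level presumably has to be raised, for example with something like the following (not shown in the original configuration):

<logger name="com.paw.kafka.elk.controller" level="DEBUG"/>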


The SpringCloud service's logs are now written to Kafka.
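From here, Logstash can pull the messages from Kafka with its kafka input plugin. A minimal pipeline sketch (the Elasticsearch host and index name are assumptions; the broker address and topic match the ones used above):

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["logger-channel"]
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "paw-kelk-%{+YYYY.MM.dd}"
  }
}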

