A Production Memory Leak Adventure

During my first weeks at the company, requirements were moving along steadily under our agile, project-based process. Then, one weekend, an operator reported in the business system's feedback group that the system was unavailable. We rushed to get help from the ops team, restarted the system, and restored service. Meanwhile we combed through the relevant logs for anomalies, but the logs led nowhere. We then wondered whether an OOM dump file had been generated; ops told us it had not. The application's previous owner said similar outages had happened before, but only sporadically. With nothing to go on from the application logs, the only option was to export a heap dump for a thorough look the next time the problem recurred. Some time later the same issue surfaced in the incident group again, so we hurried to have ops generate a dump before restarting the application, and then analyzed the dump offline.
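As an aside, on a HotSpot JVM there are two common ways to capture the kind of heap dump we asked ops for: automatically on OOM via startup flags, or on demand with jmap. The sketch below uses an illustrative dump path, jar name, and pid rather than our actual setup:

    # Let the JVM write a dump automatically the next time it OOMs
    # (set at application startup; the dump path is a placeholder)
    java -XX:+HeapDumpOnOutOfMemoryError \
         -XX:HeapDumpPath=/data/dumps/app.hprof \
         -jar app.jar

    # Or capture a dump on demand from a running process
    # ("live" forces a full GC first; 12345 is a placeholder pid)
    jmap -dump:live,format=b,file=/data/dumps/app.hprof 12345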

Release Changes That Are Both Fast and Stable? A Tencent Operations Engineer Shares His Experience

Introduction | How do you get fixes for functional defects online quickly? How do you roll back fast when a release goes wrong? How do you keep quality from slipping as efficiency improves? To address these perennial headaches for operations engineers, this column invited Yuan Xudong, a well-known Tencent operations engineer, to describe the release evolution of Cloud Object Storage (COS) and offer developers a broadly applicable method for efficient, high-quality changes. The team improved efficiency by strengthening gray-release self-testing and optimizing turnaround time and concurrency strategies, put safeguards in place for quality, and built a measurable metrics system for continuous monitoring and tuning, ultimately lifting its release and change practice to a new level. Background 1) Requirements: For operations and development engineers, production release changes are the heaviest part of the job. The concept and cadence of release changes are already old…

Import Kafka data into OSS using E-MapReduce service

Overview

Kafka is a widely used message queue in the open-source community. Although Confluent officially provides a connector plug-in for importing data from Kafka directly into HDFS, there is no official connector support for Alibaba Cloud's object storage service OSS. This article gives a simple example of writing data from Kafka to OSS. Because the Alibaba Cloud E-MapReduce service integrates a large number of open-source components along with the tooling that connects them to Alibaba Cloud, the example runs directly on an E-MapReduce cluster. It uses the open-source Flume tool as a relay between Kafka and OSS. The Flume component may also appear on the E-MapReduce platform in the future.

Scenario example

Next we walk through a simple example. If you already have a Kafka cluster online, you can skip ahead to step 4.

1. In the Kafka home directory, start the Kafka service process. In the configuration file, set the Zookeeper address to the service address emr-header-1:2181:

    bin/kafka-server-start.sh config/server.properties

2. Create a Kafka topic named test:

    bin/kafka-topics.sh --create --zookeeper emr-header-1:2181 \
      --replication-factor 1 --partitions 1 --topic test

3. Write data to the test topic; here the data is performance-monitoring output from the local machine:

    vmstat 1 | bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

4. Configure and start the Flume service in the Flume home directory.

Create a new configuration file, conf/kafka-example.conf. In it, set the source to the corresponding Kafka topic and the sink to the HDFS sinker, with the sink path pointing at an OSS location. Because the E-MapReduce service implements an efficient OSS FileSystem (compatible with Hadoop FileSystem) for us, the OSS path can be specified directly and the HDFS sinker will write the data to OSS automatically. The excerpt ends partway through this file; a hedged completion follows below.

    # Name the components on this agent
    a1.sources = source1
    a1.sinks = oss1
    a1.channels = c1
    # Describe/configure
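The excerpt above breaks off inside conf/kafka-example.conf. As a minimal sketch of how such a file is typically completed, the configuration below wires a Kafka source through a memory channel to the HDFS sinker pointing at OSS. It assumes a Flume 1.6-era Kafka source (newer Flume releases use kafka.bootstrap.servers and kafka.topics instead of zookeeperConnect and topic), and the OSS bucket path and sizing values are illustrative placeholders, not from the original article.

    # Describe/configure the source: consume the Kafka topic "test"
    # (Flume 1.6-style property names; values are placeholders)
    a1.sources.source1.type = org.apache.flume.source.kafka.KafkaSource
    a1.sources.source1.zookeeperConnect = emr-header-1:2181
    a1.sources.source1.topic = test
    a1.sources.source1.batchSize = 100

    # Describe the sink: the built-in HDFS sinker writing to an OSS path
    # (the bucket name below is a placeholder)
    a1.sinks.oss1.type = hdfs
    a1.sinks.oss1.hdfs.path = oss://YOUR-BUCKET/flume/kafka-test
    a1.sinks.oss1.hdfs.fileType = DataStream
    a1.sinks.oss1.hdfs.useLocalTimeStamp = true
    a1.sinks.oss1.hdfs.rollInterval = 300
    a1.sinks.oss1.hdfs.rollSize = 67108864
    a1.sinks.oss1.hdfs.rollCount = 0

    # Buffer events in memory between the source and the sink
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 10000
    a1.channels.c1.transactionCapacity = 1000

    # Bind the source and the sink to the channel
    a1.sources.source1.channels = c1
    a1.sinks.oss1.channel = c1

The agent can then be started from the Flume home directory with the standard launcher; --name must match the agent name a1 used in the file:

    bin/flume-ng agent --conf conf --conf-file conf/kafka-example.conf --name a1

Once the agent is running, the vmstat records produced in step 3 should begin appearing as files under the configured OSS path.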
