The first attempt used apt-get, but the package source was apparently not pointed at the latest repository, so it installed a 1.7.x version, which I removed right away.
The second attempt was to download the Elasticsearch zip package and install it directly.
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.5.1.zip
unzip elasticsearch-5.5.1.zip
In elasticsearch.yml, uncomment cluster.name and rename the cluster as you see fit, and uncomment node.name as well. If hosts other than the local machine need access, uncomment network.host and set it to 0.0.0.0, or list a few fixed IPs separated by commas. To change the default port, uncomment http.port and edit its value. After that, Elasticsearch can be started with bin/elasticsearch.
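For reference, the relevant elasticsearch.yml lines end up looking something like this once uncommented (the names and values below are just examples):
cluster.name: my-es-cluster
node.name: node-1
network.host: 0.0.0.0
http.port: 9200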
Using Logstash to automatically sync MySQL table data to Elasticsearch
Install Logstash via apt-get as the official documentation describes:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
sudo apt-get update && sudo apt-get install logstash
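The apt package installs Logstash under /usr/share/logstash, which is the prefix the paths later in this post assume; a quick way to confirm the install:
cd /usr/share/logstash
bin/logstash --version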
Because logstash-input-jdbc is written in Ruby, the gem source has to be changed first; otherwise downloads stall during installation, since some of the dependencies are hosted on Amazon.
Change the source value to https://ruby.taobao.org and the remote value to https://ruby.taobao.org, then go to the Logstash root directory and run:
bin/logstash-plugin install logstash-input-jdbc
After a short wait the plugin will be installed.
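Assuming a stock Logstash 5.x layout (the file names are my assumption), the source line lives in the Gemfile in the Logstash root and the remote line in its lock file, typically Gemfile.jruby-1.9.lock; after the edit they read roughly:
Gemfile:                   source "https://ruby.taobao.org"
Gemfile.jruby-1.9.lock:    remote: https://ruby.taobao.org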
Create a folder under the Logstash directory, e.g. logstash_jdbc_xxx, named however suits your project, and put mysql-connector-java-5.1.38.jar into the newly created folder. Then create a jdbc.conf file in the same folder; a shell sketch of the layout follows, and the full jdbc.conf is listed after it.
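A minimal sketch of that layout, assuming the apt install prefix /usr/share/logstash and the folder name logstash_jdbc_faq used in the config:
cd /usr/share/logstash/bin
mkdir -p logstash_jdbc_faq/sql
# copy the connector jar from wherever it was downloaded
cp /path/to/mysql-connector-java-5.1.38.jar logstash_jdbc_faq/
The jdbc.conf for this sync is the following: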
input {
    stdin {
    }
    # one jdbc block per table; each runs its SQL on the cron-style schedule below (every minute)
    jdbc {
        jdbc_connection_string => "jdbc:mysql://192.168.2.19:3306/survey_acquisition_faq"
        jdbc_user => "xxx"
        jdbc_password => "xxx"
        # the path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/logstash/bin/logstash_jdbc_faq/mysql-connector-java-5.1.38.jar"
        # the name of the driver class for mysql
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        # fetch results in pages of 50000 rows
        jdbc_paging_enabled => true
        jdbc_page_size => "50000"
        # the SQL statement to run is kept in this file
        statement_filepath => "/usr/share/logstash/bin/logstash_jdbc_faq/sql/t_help_document.sql"
        schedule => "* * * * *"
        type => "faq_help_document"
        # remember the last update_time value between runs (exposed to the SQL as :sql_last_value);
        # clean_run => true resets that saved state on every restart
        record_last_run => true
        use_column_value => true
        tracking_column => "update_time"
        clean_run => true
    }
    jdbc {
        jdbc_connection_string => "jdbc:mysql://192.168.2.19:3306/survey_acquisition_faq"
        jdbc_user => "xxx"
        jdbc_password => "xxx"
        # the path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/logstash/bin/logstash_jdbc_faq/mysql-connector-java-5.1.38.jar"
        # the name of the driver class for mysql
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_paging_enabled => true
        jdbc_page_size => "50000"
        statement_filepath => "/usr/share/logstash/bin/logstash_jdbc_faq/sql/t_question.sql"
        schedule => "* * * * *"
        type => "faq_help_question"
        record_last_run => true
        use_column_value => true
        tracking_column => "update_time"
        clean_run => true
    }
    jdbc {
        jdbc_connection_string => "jdbc:mysql://192.168.2.19:3306/survey_acquisition_faq"
        jdbc_user => "xxx"
        jdbc_password => "xxx"
        # the path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/logstash/bin/logstash_jdbc_faq/mysql-connector-java-5.1.38.jar"
        # the name of the driver class for mysql
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_paging_enabled => true
        jdbc_page_size => "50000"
        statement_filepath => "/usr/share/logstash/bin/logstash_jdbc_faq/sql/t_video.sql"
        schedule => "* * * * *"
        type => "faq_help_video"
        record_last_run => true
        use_column_value => true
        tracking_column => "update_time"
        clean_run => true
    }
}
filter {
    json {
        source => "message"
        remove_field => ["message"]
    }
}
output {
    elasticsearch {
        hosts => "127.0.0.1:9200"
        index => "survey-faq"
        # use the table's id column as the document id so repeated syncs update rather than duplicate
        document_id => "%{id}"
        template_overwrite => true
        template => "/usr/share/logstash/template/logstash.json"
    }
    stdout {
        codec => json_lines
    }
}
The template_overwrite and template options load a custom index template, which here sets up ik analysis; if you are not using ik, both options can simply be removed. ik analysis only works once the ik plugin has been installed with the steps below.
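The actual logstash.json is not reproduced here, so as a rough sketch under my own assumptions, a minimal Elasticsearch 5.x template that applies ik_max_word to all string fields could look something like:
{
  "template": "survey-faq*",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings_as_ik": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "text",
              "analyzer": "ik_max_word",
              "search_analyzer": "ik_max_word",
              "fields": {
                "keyword": { "type": "keyword", "ignore_above": 256 }
              }
            }
          }
        }
      ]
    }
  }
}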
Create a folder named sql under the logstash_jdbc_xxx directory to hold the SQL files. In the sql directory, create an xx.sql file containing the query you want to run, for example:
select * from tablename
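Because the jdbc inputs above track update_time (use_column_value / tracking_column), each SQL file would normally filter on the built-in :sql_last_value parameter so that every scheduled run only pulls rows changed since the previous one; a sketch for t_help_document.sql (assuming the table shares the file's name):
select * from t_help_document where update_time > :sql_last_value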
To install the ik analysis plugin in Elasticsearch, go to the Elasticsearch root directory and run:
bin/elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v5.5.1/elasticsearch-analysis-ik-5.5.1.zip
Wait for the installation to finish, then restart Elasticsearch.
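With the ik plugin installed and Elasticsearch restarted, Logstash can be started against the config; using the paths assumed throughout this post:
cd /usr/share/logstash
bin/logstash -f /usr/share/logstash/bin/logstash_jdbc_faq/jdbc.conf
Each table's query then runs every minute per the schedule, and the synced rows land in the survey-faq index; the stdout output with the json_lines codec echoes each document, which is handy for verifying the sync.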