Elasticsearch(黑马)


Getting Started with Elasticsearch


Installing Elasticsearch

1. Deploying single-node es

1.1. Create a network

Since we will also deploy a kibana container, es and kibana must be able to reach each other, so first create a docker network:

docker network create es-net

1.2. Load the image

We use the elasticsearch 7.12.1 image. It is very large, close to 1 GB, so pulling it yourself is not recommended.

The course materials provide a tar archive of the image:


Upload it to the virtual machine and load it with:

# load the image
docker load -i es.tar

Do the same for the kibana tar archive.

1.3. Run

Run the following docker command to deploy single-node es:

docker run -d \
    --name es \
    -e "ES_JAVA_OPTS=-Xms512m -Xmx512m" \
    -e "discovery.type=single-node" \
    -v es-data:/usr/share/elasticsearch/data \
    -v es-plugins:/usr/share/elasticsearch/plugins \
    --privileged \
    --network es-net \
    -p 9200:9200 \
    -p 9300:9300 \
elasticsearch:7.12.1

Command explanation:

  • -e "cluster.name=es-docker-cluster": sets the cluster name (optional; not used in the command above)

  • -e "http.host=0.0.0.0": listen address, allowing external access (optional; not used in the command above)

  • -e "ES_JAVA_OPTS=-Xms512m -Xmx512m": JVM heap size

  • -e "discovery.type=single-node": single-node (non-cluster) mode

  • -v es-data:/usr/share/elasticsearch/data: named volume for the es data directory

  • -v es-logs:/usr/share/elasticsearch/logs: named volume for the es log directory (optional; not used in the command above)

  • -v es-plugins:/usr/share/elasticsearch/plugins: named volume for the es plugin directory

  • --privileged: grants access to the volumes

  • --network es-net: joins the network named es-net

  • -p 9200:9200: port mapping (9200 for HTTP, 9300 for inter-node transport)

Open http://192.168.150.101:9200 in a browser (replace the IP with your own VM's) to see elasticsearch's response.



2. Deploying kibana

kibana provides a visual interface for elasticsearch, which makes it convenient for learning.

2.1. Deploy

Run the docker command to deploy kibana:

docker run -d \
--name kibana \
-e ELASTICSEARCH_HOSTS=http://es:9200 \
--network=es-net \
-p 5601:5601  \
kibana:7.12.1

  • --network es-net: joins the es-net network, so kibana is on the same network as elasticsearch

  • -e ELASTICSEARCH_HOSTS=http://es:9200: sets the elasticsearch address; since kibana is on the same network, the container name resolves directly

  • -p 5601:5601: port mapping

kibana usually takes a while to start; be patient. You can follow its log with:

docker logs -f kibana

When the startup-complete message appears in the log, kibana is ready.


2.2.DevTools

kibana provides a DevTools console.


In this console you can write DSL to operate elasticsearch, with auto-completion for DSL statements.



3. Installing the IK analyzer

3.1. Online install of the ik plugin (slower)

# enter the container (named "es" in the run command above)
docker exec -it es /bin/bash

# download and install online
./bin/elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.12.1/elasticsearch-analysis-ik-7.12.1.zip

# exit
exit
# restart the container
docker restart es

3.2. Offline install of the ik plugin (recommended)

1) Find the data volume directory

Installing a plugin requires knowing where elasticsearch's plugins directory is. Since we mounted it as a named volume, inspect the volume to find it:

docker volume inspect es-plugins

Output:

[
    {
        "CreatedAt": "2022-05-06T10:06:34+08:00",
        "Driver": "local",
        "Labels": null,
        "Mountpoint": "/var/lib/docker/volumes/es-plugins/_data",
        "Name": "es-plugins",
        "Options": null,
        "Scope": "local"
    }
]

So the plugins directory is mounted at /var/lib/docker/volumes/es-plugins/_data.
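
Since the inspect output is JSON, the mount point can also be extracted programmatically. A minimal Python sketch over the exact output shown above:

```python
import json

# Parse the `docker volume inspect es-plugins` output shown above
inspect_output = '''[
    {
        "CreatedAt": "2022-05-06T10:06:34+08:00",
        "Driver": "local",
        "Labels": null,
        "Mountpoint": "/var/lib/docker/volumes/es-plugins/_data",
        "Name": "es-plugins",
        "Options": null,
        "Scope": "local"
    }
]'''
# inspect returns a list of volumes; take the first entry's mount point
mountpoint = json.loads(inspect_output)[0]["Mountpoint"]
```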

2) Unzip the analyzer package

Unzip the ik analyzer package from the course materials and rename the directory to ik.


3) Upload it to the es container's plugin volume

That is, copy it into /var/lib/docker/volumes/es-plugins/_data.


4) Restart the container

# restart the container
docker restart es
# check the es logs
docker logs -f es

5) Test:

The IK analyzer has two modes:

  • ik_smart: coarsest segmentation (fewest tokens)

  • ik_max_word: finest segmentation (most tokens, including overlapping ones)

GET /_analyze
{
  "analyzer": "ik_max_word",
  "text": "黑马程序员学习java太棒了"
}

Result:

{
  "tokens" : [
    {
      "token" : "黑马",
      "start_offset" : 0,
      "end_offset" : 2,
      "type" : "CN_WORD",
      "position" : 0
    },
    {
      "token" : "程序员",
      "start_offset" : 2,
      "end_offset" : 5,
      "type" : "CN_WORD",
      "position" : 1
    },
    {
      "token" : "程序",
      "start_offset" : 2,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 2
    },
    {
      "token" : "员",
      "start_offset" : 4,
      "end_offset" : 5,
      "type" : "CN_CHAR",
      "position" : 3
    },
    {
      "token" : "学习",
      "start_offset" : 5,
      "end_offset" : 7,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "java",
      "start_offset" : 7,
      "end_offset" : 11,
      "type" : "ENGLISH",
      "position" : 5
    },
    {
      "token" : "太棒了",
      "start_offset" : 11,
      "end_offset" : 14,
      "type" : "CN_WORD",
      "position" : 6
    },
    {
      "token" : "太棒",
      "start_offset" : 11,
      "end_offset" : 13,
      "type" : "CN_WORD",
      "position" : 7
    },
    {
      "token" : "了",
      "start_offset" : 13,
      "end_offset" : 14,
      "type" : "CN_CHAR",
      "position" : 8
    }
  ]
}
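
The offsets returned by _analyze index directly into the original text, and ik_max_word's overlapping tokens (程序员 at 2–5 vs 程序 at 2–4) show why it yields more, finer-grained terms than ik_smart. A quick Python check against the result above:

```python
# The start/end offsets from the _analyze result above slice the input text.
text = "黑马程序员学习java太棒了"
tokens = [
    ("黑马", 0, 2), ("程序员", 2, 5), ("程序", 2, 4), ("员", 4, 5),
    ("学习", 5, 7), ("java", 7, 11), ("太棒了", 11, 14),
    ("太棒", 11, 13), ("了", 13, 14),
]
# every token is exactly the substring at its reported offsets
for tok, start, end in tokens:
    assert text[start:end] == tok
```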

Search results can be sorted on one or more fields. The query below sorts all hotels by score (descending), then by price (ascending):

GET /hotel/_search
{
  "query": {
    "match_all": {}
  },
  "sort": [
    {
      "score": {
        "order": "desc"
      }
    },
    {
      "price": {
        "order": "asc"
      }
    }
  ]
}
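
The two sort keys act like a composite key: score descending first, price ascending as the tie-breaker. A Python sketch with made-up hotel data mirrors the ordering:

```python
# Made-up sample data; mirrors the DSL sort: score desc, then price asc.
hotels = [
    {"name": "A", "score": 45, "price": 300},
    {"name": "B", "score": 48, "price": 200},
    {"name": "C", "score": 45, "price": 150},
]
# negate score so the highest score comes first; price breaks ties ascending
ranked = sorted(hotels, key=lambda h: (-h["score"], h["price"]))
names = [h["name"] for h in ranked]
```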

Sorting by distance to a given point (geo sort):

GET /hotel/_search
{
  "query": {
    "match_all": {}
  },
  "sort": [
    {
      "_geo_distance": {
        "location": "31.034661,121.612282",
        "order": "asc",
        "unit": "km"
      }
    }
  ]
}
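
The kilometre values that _geo_distance sorting returns are great-circle distances. A rough Python haversine sketch of that computation; the second coordinate is a made-up hotel location, not data from the course:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle distance in km, the same idea behind ES's _geo_distance sort
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# distance from the query's reference point to a hypothetical hotel location
d = haversine_km(31.034661, 121.612282, 31.21, 121.5)
```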


# Highlight query; by default ES requires the highlighted field to be the same as the searched field
GET /hotel/_search
{
  "query": {
    "match": {
      "all":"如家"
    }
  },
  "highlight": {
    "fields": {
      "name": {
        "require_field_match": "false"
      }
    }
  }
}

Implementing the search endpoint. Controller:

@RestController
@RequestMapping("/hotel")
public class HotelController {

    @Autowired
    private IHotelService hotelService;

    @PostMapping("/list")
    public PageResult list(@RequestBody RequestParams params) throws IOException {
        return hotelService.search(params);
    }
}

Service:

@Service
public class HotelService extends ServiceImpl<HotelMapper, Hotel> implements IHotelService {

    @Autowired
    private RestHighLevelClient client;

    @Override
    public PageResult search(RequestParams params) throws IOException {
        //connect to elasticsearch
        this.client = new RestHighLevelClient(RestClient.builder(
                HttpHost.create("http://192.168.136.150:9200")
        ));
        //1. prepare the Request
        SearchRequest request = new SearchRequest("hotel");
        //2. build the DSL parameters
        buidBasicQuery(params, request);
        //sort by geo distance
        String location = params.getLocation();
        if(!StringUtils.isEmpty(location)){
            request.source().sort(SortBuilders
                    .geoDistanceSort("location", new GeoPoint(location))
                    .order(SortOrder.ASC)
                    .unit(DistanceUnit.KILOMETERS)
            );
        }
        //field sort
        String sortBy = params.getSortBy();
        if(!sortBy.equals("default")){
            request.source().sort(sortBy);
        }
        //pagination
        Integer page = params.getPage();
        Integer size = params.getSize();
        if(page != null && size != null){
            request.source().from((page - 1) * size).size(size);
        }
        //3. send the request
        SearchResponse response = client.search(request, RequestOptions.DEFAULT);
        //4. parse the result
        PageResult pageResult = handleResponse(response);
        //close the connection
        this.client.close();
        return pageResult;
    }

    private void buidBasicQuery(RequestParams params, SearchRequest request) {
        //keyword condition
        BoolQueryBuilder boolQuery = QueryBuilders.boolQuery();
        String key = params.getKey();
        if(!StringUtils.isEmpty(key)){
            boolQuery.must(QueryBuilders.matchQuery("all", key));
        }else{
            boolQuery.must(QueryBuilders.matchAllQuery());
        }
        //brand
        String brand = params.getBrand();
        if(!StringUtils.isEmpty(brand)){
            boolQuery.must(QueryBuilders.termQuery("brand", brand));
        }
        //city
        String city = params.getCity();
        if(!StringUtils.isEmpty(city)){
            boolQuery.must(QueryBuilders.termQuery("city", city));
        }
        //star rating
        String starName = params.getStarName();
        if(!StringUtils.isEmpty(starName)){
            boolQuery.must(QueryBuilders.termQuery("starName", starName));
        }
        //price range
        Integer minPrice = params.getMinPrice();
        Integer maxPrice = params.getMaxPrice();
        if(minPrice != null && maxPrice != null){
            boolQuery.filter(QueryBuilders.rangeQuery("price").gte(minPrice).lte(maxPrice));
        }
        //score control
        FunctionScoreQueryBuilder functionScoreQuery =
                QueryBuilders.functionScoreQuery(
                        //original query
                        boolQuery,
                        //function score array
                        new FunctionScoreQueryBuilder.FilterFunctionBuilder[]{
                                //a single function score element
                                new FunctionScoreQueryBuilder.FilterFunctionBuilder(
                                        //filter condition
                                        QueryBuilders.termQuery("isAD", true),
                                        //scoring function
                                        ScoreFunctionBuilders.weightFactorFunction(10)
                                )
                        });
        request.source().query(functionScoreQuery);
    }

    //parse the search response
    private PageResult handleResponse(SearchResponse response){
        SearchHits searchHits = response.getHits();
        //1. total hit count
        Long total = searchHits.getTotalHits().value;
        //2. the array of hits
        SearchHit[] hits = searchHits.getHits();
        //3. iterate over the hits
        List<HotelDoc> hotels = new ArrayList<>();
        for (SearchHit hit : hits) {
            //get the document source
            String json = hit.getSourceAsString();
            //deserialize
            HotelDoc hotelDoc = JSON.parseObject(json, HotelDoc.class);
            //get the sort value (the distance, when geo sort is active)
            Object[] sortValues = hit.getSortValues();
            if(sortValues.length > 0){
                hotelDoc.setDistance(sortValues[0]);
            }
            //get highlight results
            Map<String, HighlightField> highlightFields = hit.getHighlightFields();
            if(!CollectionUtils.isEmpty(highlightFields)){
                //get the highlight result by field name
                HighlightField highlightField = highlightFields.get("name");
                if(highlightField != null){
                    //take the highlighted value
                    String name = highlightField.getFragments()[0].string();
                    hotelDoc.setName(name);
                }
            }
            //collect
            hotels.add(hotelDoc);
        }
        //4. build the return value
        return new PageResult(total, hotels);
    }
}
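
handleResponse boils down to: read hits.total.value, walk hits.hits, deserialize _source, and treat the first sort value as the distance. The same steps applied to a raw JSON body; the sample body below is made up but follows the response shape:

```python
import json

# Hypothetical ES search response fragment with the fields handleResponse reads
body = '''{
  "hits": {
    "total": {"value": 2},
    "hits": [
      {"_source": {"name": "如家酒店", "price": 300}, "sort": [2.1]},
      {"_source": {"name": "7天酒店", "price": 200}, "sort": [4.3]}
    ]
  }
}'''
resp = json.loads(body)
total = resp["hits"]["total"]["value"]
hotels = []
for hit in resp["hits"]["hits"]:
    doc = dict(hit["_source"])
    if hit.get("sort"):
        doc["distance"] = hit["sort"][0]  # first sort value = distance under geo sort
    hotels.append(doc)
```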

POJO:

@Data
@NoArgsConstructor
public class HotelDoc {
    private Long id;
    private String name;
    private String address;
    private Integer price;
    private Integer score;
    private String brand;
    private String city;
    private String starName;
    private String business;
    private String location;
    private String pic;

    private Object distance;
    private Boolean isAD;

    public HotelDoc(Hotel hotel) {
        this.id = hotel.getId();
        this.name = hotel.getName();
        this.address = hotel.getAddress();
        this.price = hotel.getPrice();
        this.score = hotel.getScore();
        this.brand = hotel.getBrand();
        this.city = hotel.getCity();
        this.starName = hotel.getStarName();
        this.business = hotel.getBusiness();
        this.location = hotel.getLatitude() + ", " + hotel.getLongitude();
        this.pic = hotel.getPic();
    }
}

Aggregations. The DSL below buckets hotels by brand, computes score stats inside each bucket, and orders the brands by average score:

# terms (bucket) aggregation with a nested stats (metric) aggregation
GET /hotel/_search
{
  "size": 0,
  "aggs": {
    "brandAgg": {
      "terms": {
        "field": "brand",
        "size": 20,
        "order": {
          "scoreAgg.avg": "desc"
        }
      },
      "aggs": {
        "scoreAgg": {
          "stats": {
            "field": "score"
          }
        }
      }
    }
  }
}
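
What the aggregation computes can be mimicked in a few lines: bucket by brand, average the scores per bucket, order the buckets by that average descending. A Python sketch over made-up sample data:

```python
from collections import defaultdict

# Made-up hotel docs, just to illustrate the bucket-then-order logic
hotels = [
    {"brand": "如家", "score": 46}, {"brand": "如家", "score": 44},
    {"brand": "皇冠假日", "score": 48}, {"brand": "7天", "score": 40},
]
# bucket scores by brand (the terms aggregation)
buckets = defaultdict(list)
for h in hotels:
    buckets[h["brand"]].append(h["score"])
# order brands by average score, descending (the "scoreAgg.avg": "desc" part)
ordered = sorted(buckets, key=lambda b: sum(buckets[b]) / len(buckets[b]), reverse=True)
```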

To feed the frontend's filter lists (city, star rating, brand), expose an aggregation endpoint.

Controller:

@PostMapping("/filters")
public Map<String, List<String>> filters(@RequestBody RequestParams params) throws IOException {
    return hotelService.filters(params);
}

Service:

@Override
public Map<String, List<String>> filters(RequestParams params) throws IOException {
    //1. create the Request
    SearchRequest request = new SearchRequest("hotel");
    //2. build the DSL
    //2.1 query
    buidBasicQuery(params, request);
    //2.2 size 0: we only want aggregations, not hits
    request.source().size(0);
    //2.3 aggregations
    buildAggregation(request);
    //3. send the request
    SearchResponse response = client.search(request, RequestOptions.DEFAULT);
    //4. parse the result
    Map<String, List<String>> map = new HashMap<>();
    //4.1 city aggregation
    List<String> cityList = getAggByName(response, "cityAgg");
    map.put("城市", cityList);
    //4.2 star-rating aggregation
    List<String> starList = getAggByName(response, "starAgg");
    map.put("星级", starList);
    //4.3 brand aggregation
    List<String> brandList = getAggByName(response, "brandAgg");
    map.put("品牌", brandList);
    return map;
}


private void buildAggregation(SearchRequest request) {
    //aggregate by city
    request.source().aggregation(
            AggregationBuilders.terms("cityAgg").field("city").size(100)
    );
    //aggregate by star rating
    request.source().aggregation(
            AggregationBuilders.terms("starAgg").field("starName").size(100)
    );
    //aggregate by brand
    request.source().aggregation(
            AggregationBuilders.terms("brandAgg").field("brand").size(100)
    );
}

private List<String> getAggByName(SearchResponse response, String aggName) {
    Aggregations aggregations = response.getAggregations();
    Terms terms = aggregations.get(aggName);
    //get the buckets
    List<? extends Terms.Bucket> buckets = terms.getBuckets();
    //collect each bucket's key
    List<String> list = new ArrayList<>();
    for (Terms.Bucket bucket : buckets) {
        list.add(bucket.getKeyAsString());
    }
    return list;
}
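
getAggByName simply walks the buckets of one named aggregation and collects each bucket's key. The same extraction on a raw JSON response fragment; the sample body is made up:

```python
import json

# Hypothetical fragment of an ES aggregation response body
body = '''{
  "aggregations": {
    "brandAgg": {
      "buckets": [
        {"key": "如家", "doc_count": 30},
        {"key": "7天", "doc_count": 20}
      ]
    }
  }
}'''
# walk the named aggregation's buckets and keep only the keys
buckets = json.loads(body)["aggregations"]["brandAgg"]["buckets"]
keys = [b["key"] for b in buckets]
```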

Auto-completion. First, test the pinyin analyzer plugin:

POST /_analyze
{
  "text": ["如家酒店还不错"],
  "analyzer": "pinyin"
}

A custom analyzer can combine the ik tokenizer with the pinyin token filter. Create a test index that uses it:

# custom pinyin analyzer
PUT /test
{
  "settings": {
    "analysis": {
      "analyzer": { 
        "my_analyzer": { 
          "tokenizer": "ik_max_word",
          "filter": "py"
        }
      },
      "filter": {
        "py": { 
          "type": "pinyin",
          "keep_full_pinyin": false,
          "keep_joined_full_pinyin": true,
          "keep_original": true,
          "limit_first_letter_length": 16,
          "remove_duplicated_term": true,
          "none_chinese_pinyin_tokenize": false
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "name":{
        "type": "text",
        "analyzer": "my_analyzer"
      }
    }
  }
}

Then test it:

POST /test/_analyze
{
  "text": ["如家酒店还不错"],
  "analyzer": "my_analyzer"
}


POST /test/_doc/1
{
  "id": 1,
  "name": "狮子"
}
POST /test/_doc/2
{
  "id": 2,
  "name": "虱子"
}

# searching the pinyin "shizi" matches both 狮子 and 虱子 (homophones)
GET /test/_search
{
  "query": {
    "match": {
      "name": "shizi"
    }
  }
}
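
The query above exposes the pinyin pitfall: with my_analyzer applied at search time too, the pinyin term "shizi" matches both 狮子 and 虱子. A toy inverted index illustrates the effect (the terms are illustrative, not the plugin's exact token output):

```python
from collections import defaultdict

# Toy inverted index: each doc is indexed under both its original term
# and its pinyin, which is identical for the two homophones.
docs = {1: "狮子", 2: "虱子"}
index = defaultdict(set)
for doc_id, word in docs.items():
    index[word].add(doc_id)    # original term
    index["shizi"].add(doc_id) # pinyin term, shared by both words

# searching by pinyin hits both homophones...
hits_pinyin = index["shizi"]
# ...while searching by the original text stays precise
hits_text = index["狮子"]
```

This is why the pinyin filter should be used at index time only, with a plain analyzer (e.g. ik_smart) at search time.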

The full hotel index: pinyin-augmented analysis at index time, ik_smart at search time, plus a completion field for suggestions:

// hotel data index
PUT /hotel
{
  "settings": {
    "analysis": {
      "analyzer": {
        "text_anlyzer": {
          "tokenizer": "ik_max_word",
          "filter": "py"
        },
        "completion_analyzer": {
          "tokenizer": "keyword",
          "filter": "py"
        }
      },
      "filter": {
        "py": {
          "type": "pinyin",
          "keep_full_pinyin": false,
          "keep_joined_full_pinyin": true,
          "keep_original": true,
          "limit_first_letter_length": 16,
          "remove_duplicated_term": true,
          "none_chinese_pinyin_tokenize": false
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "id":{
        "type": "keyword"
      },
      "name":{
        "type": "text",
        "analyzer": "text_anlyzer",
        "search_analyzer": "ik_smart",
        "copy_to": "all"
      },
      "address":{
        "type": "keyword",
        "index": false
      },
      "price":{
        "type": "integer"
      },
      "score":{
        "type": "integer"
      },
      "brand":{
        "type": "keyword",
        "copy_to": "all"
      },
      "city":{
        "type": "keyword"
      },
      "starName":{
        "type": "keyword"
      },
      "business":{
        "type": "keyword",
        "copy_to": "all"
      },
      "location":{
        "type": "geo_point"
      },
      "pic":{
        "type": "keyword",
        "index": false
      },
      "all":{
        "type": "text",
        "analyzer": "text_anlyzer",
        "search_analyzer": "ik_smart"
      },
      "suggestion":{
          "type": "completion",
          "analyzer": "completion_analyzer"
      }
    }
  }
}


GET /hotel/_search
{
  "suggest": {
    "title_suggest": {
      "text": "地", // the prefix to complete
      "completion": {
        "field": "suggestion", // the completion field
        "skip_duplicates": true, // drop duplicate suggestions
        "size": 10 // number of results
      }
    }
  }
}
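
A completion suggester is essentially prefix matching plus optional de-duplication and a size cap. A rough Python equivalent over made-up suggestion entries:

```python
def suggest(entries, prefix, skip_duplicates=True, size=10):
    # naive prefix completion: keep entries starting with the prefix,
    # optionally drop duplicates, cap at `size` results
    out, seen = [], set()
    for e in entries:
        if e.startswith(prefix):
            if skip_duplicates and e in seen:
                continue
            seen.add(e)
            out.append(e)
            if len(out) == size:
                break
    return out

# made-up suggestion values, mirroring the "地" prefix query above
entries = ["地铁站", "地中海", "地铁站", "东方明珠"]
result = suggest(entries, "地", size=10)
```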

Implementing the suggestion endpoint. Controller:

@GetMapping("/suggestion")
public List<String> suggestion(String key) throws IOException {
    return hotelService.suggestion(key);
}

Service:

@Override
public List<String> suggestion(String key) throws IOException {
    //1. create the Request
    SearchRequest request = new SearchRequest("hotel");
    //2. build the DSL
    request.source().suggest(new SuggestBuilder().addSuggestion(
            "hotelSuggestion",
            SuggestBuilders
                    .completionSuggestion("suggestion")
                    .prefix(key)
                    .skipDuplicates(true)
                    .size(100)
    ));
    //3. send the request
    SearchResponse response = client.search(request, RequestOptions.DEFAULT);
    //4. parse the result
    Suggest suggest = response.getSuggest();
    //get the suggestion result by the name used above
    CompletionSuggestion suggestion = suggest.getSuggestion("hotelSuggestion");
    //each option's text is one completed term
    List<String> list = new ArrayList<>();
    for (CompletionSuggestion.Entry.Option option : suggestion.getOptions()) {
        list.add(option.getText().string());
    }
    return list;
}

Data synchronization: keep elasticsearch in sync with the database by publishing an MQ message on every write.

Sending MQ messages

@PostMapping
public void saveHotel(@RequestBody Hotel hotel){
    hotelService.save(hotel);
    rabbitTemplate.convertAndSend(MqConstants.HOTEL_EXCHANGE, MqConstants.HOTEL_INSERT_KEY, hotel.getId());
}

@PutMapping()
public void updateById(@RequestBody Hotel hotel){
    if (hotel.getId() == null) {
        throw new InvalidParameterException("id must not be null");
    }
    hotelService.updateById(hotel);
    //insert and update share the same routing key
    rabbitTemplate.convertAndSend(MqConstants.HOTEL_EXCHANGE, MqConstants.HOTEL_INSERT_KEY, hotel.getId());
}

@DeleteMapping("/{id}")
public void deleteById(@PathVariable("id") Long id) {
    hotelService.removeById(id);
    rabbitTemplate.convertAndSend(MqConstants.HOTEL_EXCHANGE, MqConstants.HOTEL_DELETE_KEY, id);
}

Listening for MQ messages

public class MqConstants {
    /**
     * exchange name
     */
    public final static String HOTEL_EXCHANGE = "hotel.topic";
    /**
     * queue for insert and update messages
     */
    public final static String HOTEL_INSERT_QUEUE = "hotel.insert.queue";
    /**
     * queue for delete messages
     */
    public final static String HOTEL_DELETE_QUEUE = "hotel.delete.queue";
    /**
     * routing key for insert/update
     */
    public final static String HOTEL_INSERT_KEY = "hotel.insert";
    /**
     * routing key for delete
     */
    public final static String HOTEL_DELETE_KEY = "hotel.delete";
}
@Configuration
public class MqConfig {
    //declare the exchange
    @Bean
    public TopicExchange topicExchange(){
        return new TopicExchange(MqConstants.HOTEL_EXCHANGE, true, false);
    }
    //declare the queues
    @Bean
    public Queue insertQueue(){
        return new Queue(MqConstants.HOTEL_INSERT_QUEUE, true);
    }

    @Bean
    public Queue deleteQueue(){
        return new Queue(MqConstants.HOTEL_DELETE_QUEUE, true);
    }
    //bind the queues to the exchange
    @Bean
    public Binding insertQueueBinding(){
        return BindingBuilder.bind(insertQueue()).to(topicExchange()).with(MqConstants.HOTEL_INSERT_KEY);
    }
    @Bean
    public Binding deleteQueueBinding(){
        return BindingBuilder.bind(deleteQueue()).to(topicExchange()).with(MqConstants.HOTEL_DELETE_KEY);
    }
}


4. Deploying an es cluster

We will run several es instances in docker containers on one machine to simulate a cluster. In production, run only one es instance per server node.

The cluster can be deployed directly with docker-compose, which requires the Linux VM to have at least 4 GB of memory.

4.1. Creating the es cluster

First write a docker-compose file:

version: '2.2'
services:
  es01:
    image: elasticsearch:7.12.1
    container_name: es01
    environment:
      - node.name=es01
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es02,es03
      - cluster.initial_master_nodes=es01,es02,es03
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    volumes:
      - data01:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - elastic
  es02:
    image: elasticsearch:7.12.1
    container_name: es02
    environment:
      - node.name=es02
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es01,es03
      - cluster.initial_master_nodes=es01,es02,es03
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    volumes:
      - data02:/usr/share/elasticsearch/data
    ports:
      - 9201:9200
    networks:
      - elastic
  es03:
    image: elasticsearch:7.12.1
    container_name: es03
    environment:
      - node.name=es03
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es01,es02
      - cluster.initial_master_nodes=es01,es02,es03
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    volumes:
      - data03:/usr/share/elasticsearch/data
    networks:
      - elastic
    ports:
      - 9202:9200
volumes:
  data01:
    driver: local
  data02:
    driver: local
  data03:
    driver: local

networks:
  elastic:
    driver: bridge

Running es requires raising a kernel limit. Edit /etc/sysctl.conf:

vi /etc/sysctl.conf

Add the following line:

vm.max_map_count=262144

Then apply the change:

sysctl -p

Start the cluster with docker-compose:

docker-compose up -d

4.2. Cluster status monitoring

kibana can monitor an es cluster, but newer versions depend on es's x-pack feature and are complex to configure.

Instead we recommend cerebro for monitoring cluster state. Official site: GitHub - lmenezes/cerebro

The course materials include the package:


No installation is needed: unzip the package, open the bin directory, and double-click cerebro.bat to start the service.

Visit http://localhost:9000 to open the management UI.


Enter the address and port of any elasticsearch node and click connect.


A green status bar means the cluster is green (healthy).

4.3. Creating an index

1) Create the index with kibana's DevTools

Enter this in DevTools:

PUT /itcast
{
  "settings": {
    "number_of_shards": 3, // number of shards
    "number_of_replicas": 1 // number of replicas
  },
  "mappings": {
    "properties": {
      // mapping definitions ...
    }
  }
}
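
number_of_shards cannot be changed after index creation because elasticsearch routes every document with shard = hash(_routing) % number_of_shards (routing defaults to the document id); changing the count would invalidate existing placements. A simplified sketch of the scheme, with crc32 standing in for the murmur3 hash elasticsearch actually uses:

```python
import zlib

def route(doc_id: str, number_of_shards: int) -> int:
    # hash the routing value, then take it modulo the shard count;
    # crc32 is only a stand-in here for ES's murmur3
    return zlib.crc32(doc_id.encode()) % number_of_shards

# every document id lands deterministically on one of the 3 shards
shards = [route(str(i), 3) for i in range(100)]
```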

2) Create the index with cerebro

cerebro can also create an index: fill in the index info and click the create button in the bottom right.

Viewing the shard layout

Back on the home page you can see how the index's shards are distributed across the nodes.

