Many web applications include some kind of search functionality. While iterating through a small data set might work fine, performance can become an issue with more extensive databases. Relational databases can prove relatively slow when searching through a lot of data.
A solution to the above problem might be Elasticsearch. It is a search engine that focuses heavily on performance. When using it, we maintain a separate document-oriented database.
If you are familiar with MongoDB, document-oriented databases will ring a bell for you. In theory, we might use Elasticsearch as a general-purpose database. It wasn't designed for this purpose, though. If you would like to read more about it, check out this question on Stack Overflow.
Running Elasticsearch
Running Elasticsearch means maintaining a separate, search-optimized database. Because of that, we need to choose one of the ways to fire it up.
In the second part of this series, we’ve started using Docker Compose. Therefore, a fitting way to start using Elasticsearch would be to do so through Docker. When we go to the official Elasticsearch documentation, we can see an example using Docker Compose. It includes three nodes.
An Elasticsearch cluster is a group of one or more Elasticsearch nodes connected. Each node is an instance of Elasticsearch.
Let’s add the above official configuration to our existing file.
```yaml
version: "3"
services:
  postgres:
    container_name: postgres
    image: postgres:latest
    ports:
      - "5432:5432"
    volumes:
      - /data/postgres:/data/postgres
    env_file:
      - docker.env
    networks:
      - postgres

  pgadmin:
    links:
      - postgres:postgres
    container_name: pgadmin
    image: dpage/pgadmin4
    ports:
      - "8080:80"
    volumes:
      - /data/pgadmin:/root/.pgadmin
    env_file:
      - docker.env
    networks:
      - postgres

  es01:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.1
    container_name: es01
    environment:
      - node.name=es01
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es02,es03
      - cluster.initial_master_nodes=es01,es02,es03
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data01:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - elastic
  es02:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.1
    container_name: es02
    environment:
      - node.name=es02
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es01,es03
      - cluster.initial_master_nodes=es01,es02,es03
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data02:/usr/share/elasticsearch/data
    networks:
      - elastic
  es03:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.1
    container_name: es03
    environment:
      - node.name=es03
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es01,es02
      - cluster.initial_master_nodes=es01,es02,es03
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data03:/usr/share/elasticsearch/data
    networks:
      - elastic

volumes:
  data01:
    driver: local
  data02:
    driver: local
  data03:
    driver: local

networks:
  postgres:
    driver: bridge
  elastic:
    driver: bridge
```

You might run into an issue when doing the above: es01 exited with code 78. There is a high chance that increasing vm.max_map_count will help, as described here.
By default, the password for Elasticsearch is changeme. To set our own password, we can add it to our docker.env file:
```
(...)
ELASTIC_PASSWORD=admin
```

The default username is "elastic".
Connecting to Elasticsearch in NestJS
To use Elasticsearch within our NestJS project, we can use the official @nestjs/elasticsearch library.
It wraps the @elastic/elasticsearch client. Since it is a peer dependency of @nestjs/elasticsearch, we need to install it.
Don’t confuse it with the “elasticsearch” client that will soon be deprecated.
```shell
npm install @nestjs/elasticsearch @elastic/elasticsearch
```

Due to how we set up Elasticsearch, our cluster is available at http://localhost:9200. Our username is elastic, and the password is admin. We need to add all of the above to our environment variables.
```
(...)
ELASTICSEARCH_NODE=http://localhost:9200
ELASTICSEARCH_USERNAME=elastic
ELASTICSEARCH_PASSWORD=admin
```

Now we can create our module that uses the above configuration.
```typescript
import { Module } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { ElasticsearchModule } from '@nestjs/elasticsearch';

@Module({
  imports: [
    ConfigModule,
    ElasticsearchModule.registerAsync({
      imports: [ConfigModule],
      useFactory: async (configService: ConfigService) => ({
        node: configService.get('ELASTICSEARCH_NODE'),
        auth: {
          username: configService.get('ELASTICSEARCH_USERNAME'),
          password: configService.get('ELASTICSEARCH_PASSWORD'),
        }
      }),
      inject: [ConfigService],
    }),
  ],
  exports: [ElasticsearchModule]
})
export class SearchModule {}
```

We export the ElasticsearchModule above so that we are able to use some of its features when importing the SearchModule, as suggested here.
Populating Elasticsearch with data
The first thing to consider when populating Elasticsearch with data is the concept of an index. In the context of Elasticsearch, we group similar documents by assigning them the same index.
In previous versions of Elasticsearch, we also used types to group documents, but this concept is being abandoned.
When populating the Elasticsearch database with data, we throw in only the parts that we later use when searching. Let’s create an interface for that purpose.
```typescript
export default interface PostSearchBody {
  id: number;
  title: string;
  content: string;
  authorId: number;
}
```

The TypeScript support with Elasticsearch is not that good, unfortunately. Following the official documentation, we can create a search response type for our posts.
```typescript
import PostSearchBody from './postSearchBody.interface';

export default interface PostSearchResult {
  hits: {
    total: number;
    hits: Array<{
      _source: PostSearchBody;
    }>;
  };
}
```

When we're done with the above, we can create a service that takes care of interacting with our Elasticsearch cluster.
```typescript
import { Injectable } from '@nestjs/common';
import { ElasticsearchService } from '@nestjs/elasticsearch';
import Post from './post.entity';
import PostSearchResult from './types/postSearchResponse.interface';
import PostSearchBody from './types/postSearchBody.interface';

@Injectable()
export default class PostsSearchService {
  index = 'posts';

  constructor(
    private readonly elasticsearchService: ElasticsearchService
  ) {}

  async indexPost(post: Post) {
    return this.elasticsearchService.index<PostSearchResult, PostSearchBody>({
      index: this.index,
      body: {
        id: post.id,
        title: post.title,
        content: post.content,
        authorId: post.author.id
      }
    })
  }

  async search(text: string) {
    const { body } = await this.elasticsearchService.search<PostSearchResult>({
      index: this.index,
      body: {
        query: {
          multi_match: {
            query: text,
            fields: ['title', 'content']
          }
        }
      }
    })
    const hits = body.hits.hits;
    return hits.map((item) => item._source);
  }
}
```

Above, we use multi_match because we want to search through both the title and the content of the posts.
The crucial thing to acknowledge about elasticsearchService.search is that it returns just the properties that we’ve put into the Elasticsearch database. Since we save the ids of the posts, we can now get the whole documents from our Postgres database. Let’s put this logic into PostsService.
```typescript
import { Injectable } from '@nestjs/common';
import CreatePostDto from './dto/createPost.dto';
import Post from './post.entity';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository, In } from 'typeorm';
import User from '../users/user.entity';
import PostsSearchService from './postsSearch.service';

@Injectable()
export default class PostsService {
  constructor(
    @InjectRepository(Post)
    private postsRepository: Repository<Post>,
    private postsSearchService: PostsSearchService
  ) {}

  // (...)

  async createPost(post: CreatePostDto, user: User) {
    const newPost = await this.postsRepository.create({
      ...post,
      author: user
    });
    await this.postsRepository.save(newPost);
    this.postsSearchService.indexPost(newPost);
    return newPost;
  }

  async searchForPosts(text: string) {
    const results = await this.postsSearchService.search(text);
    const ids = results.map(result => result.id);
    if (!ids.length) {
      return [];
    }
    return this.postsRepository
      .find({
        where: { id: In(ids) }
      });
  }
}
```

The last thing to do is to modify the controller so that it accepts a query parameter.
```typescript
import {
  Controller,
  Get,
  UseInterceptors,
  ClassSerializerInterceptor,
  Query,
} from '@nestjs/common';
import PostsService from './posts.service';

@Controller('posts')
@UseInterceptors(ClassSerializerInterceptor)
export default class PostsController {
  constructor(
    private readonly postsService: PostsService
  ) {}

  @Get()
  async getPosts(@Query('search') search: string) {
    if (search) {
      return this.postsService.searchForPosts(search);
    }
    return this.postsService.getAllPosts();
  }

  // (...)

}
```

Don't forget to import the SearchModule in the PostsModule.
Keeping Elasticsearch consistent with our database
Through our API, we can also edit and delete posts. Therefore, we need to put some effort into keeping the Elasticsearch database consistent with our Postgres instance.
Deleting documents
Since we save the id of the post in our Elasticsearch database, we can use it to find the matching document and delete it. To do so, we can use the deleteByQuery function.
```typescript
import { Injectable } from '@nestjs/common';
import { ElasticsearchService } from '@nestjs/elasticsearch';

@Injectable()
export default class PostsSearchService {
  index = 'posts';

  constructor(
    private readonly elasticsearchService: ElasticsearchService
  ) {}

  // (...)

  async remove(postId: number) {
    return this.elasticsearchService.deleteByQuery({
      index: this.index,
      body: {
        query: {
          match: {
            id: postId,
          }
        }
      }
    })
  }
}
```

Let's call the above method in PostsService every time we delete a post.
```typescript
import { Injectable } from '@nestjs/common';
import Post from './post.entity';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository, In } from 'typeorm';
import PostNotFoundException from './exceptions/postNotFound.exception';
import PostsSearchService from './postsSearch.service';

@Injectable()
export default class PostsService {
  constructor(
    @InjectRepository(Post)
    private postsRepository: Repository<Post>,
    private postsSearchService: PostsSearchService
  ) {}

  // (...)

  async deletePost(id: number) {
    const deleteResponse = await this.postsRepository.delete(id);
    if (!deleteResponse.affected) {
      throw new PostNotFoundException(id);
    }
    await this.postsSearchService.remove(id);
  }
}
```

Modifying documents
The other part of keeping the Elasticsearch database consistent with our main database is modifying existing documents. To do that, we can use the updateByQuery function.
Unfortunately, we need to write a script that updates all of the necessary fields. For example, to update the title and the content, we need:
```
ctx._source.title='New title'; ctx._source.content='New content';
```

We can create the above script dynamically.
```typescript
import { Injectable } from '@nestjs/common';
import { ElasticsearchService } from '@nestjs/elasticsearch';
import Post from './post.entity';
import PostSearchBody from './types/postSearchBody.interface';

@Injectable()
export default class PostsSearchService {
  index = 'posts';

  constructor(
    private readonly elasticsearchService: ElasticsearchService
  ) {}

  // (...)

  async update(post: Post) {
    const newBody: PostSearchBody = {
      id: post.id,
      title: post.title,
      content: post.content,
      authorId: post.author.id
    }

    const script = Object.entries(newBody).reduce((result, [key, value]) => {
      return `${result} ctx._source.${key}='${value}';`;
    }, '');

    return this.elasticsearchService.updateByQuery({
      index: this.index,
      body: {
        query: {
          match: {
            id: post.id,
          }
        },
        script: {
          source: script
        }
      }
    })
  }
}
```

Now we need to use the above method whenever we modify existing posts.
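To see what the generated update script looks like, we can run the same reduce on a made-up post body:

```typescript
// Standalone sketch of the script generation used in the update method,
// applied to made-up sample values.
interface PostSearchBody {
  id: number;
  title: string;
  content: string;
  authorId: number;
}

const newBody: PostSearchBody = {
  id: 1,
  title: 'New title',
  content: 'New content',
  authorId: 7,
};

// Concatenate one Painless assignment per field
const script = Object.entries(newBody).reduce((result, [key, value]) => {
  return `${result} ctx._source.${key}='${value}';`;
}, '');

console.log(script);
// " ctx._source.id='1'; ctx._source.title='New title'; ctx._source.content='New content'; ctx._source.authorId='7';"
```

Note that this naive interpolation wraps every value in single quotes, including numbers, and a single quote inside the content would break the generated script, so it is a simplification rather than a production-ready approach.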
```typescript
import { Injectable } from '@nestjs/common';
import Post from './post.entity';
import UpdatePostDto from './dto/updatePost.dto';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository, In } from 'typeorm';
import PostNotFoundException from './exceptions/postNotFound.exception';
import PostsSearchService from './postsSearch.service';

@Injectable()
export default class PostsService {
  constructor(
    @InjectRepository(Post)
    private postsRepository: Repository<Post>,
    private postsSearchService: PostsSearchService
  ) {}

  async updatePost(id: number, post: UpdatePostDto) {
    await this.postsRepository.update(id, post);
    const updatedPost = await this.postsRepository.findOne(id, { relations: ['author'] });
    if (updatedPost) {
      await this.postsSearchService.update(updatedPost);
      return updatedPost;
    }
    throw new PostNotFoundException(id);
  }
}
```

The Elasticsearch documents also have ids. An alternative to the above deletes and updates would be to store the Elasticsearch id in our Postgres database and use it when deleting and updating.
Summary
Today we’ve learned the very basics of Elasticsearch. When doing so, we’ve added it to our NestJS API. We’ve also created our documents and searched through them. All of that is the tip of the Elasticsearch iceberg. There is a lot more to learn here, so stay tuned!