data persistence


Vishal Kaushik
Aug 7, 2019, 3:37:31 AM
to Angular and AngularJS discussion

Hi guys,

I am fairly new to Angular and working on an existing Angular 2 + .NET REST API project. The project has quite a few search screens that retrieve records page by page. The REST APIs get data from SQL using Entity Framework. Depending on the search parameters, the result count can go up to 2 million records, but fetching 10 records at a time is slow for two reasons: 1. the table holds a lot of data (over 5 million rows), and 2. hitting the database for every 10 records does not make sense.
So I want to change all the searches. My idea is to fetch about 500 (or 1000) records at once and persist them locally somehow. Instead of hitting the database for every 10 records, I would let the user move through all 500 records (50 pages of 10 records each), then hit the database for the next 500. I want to find out what the best approach for doing this in Angular 2 is.

Any guidance will be appreciated.

Thanks
KVishal

Tito
Aug 7, 2019, 8:07:14 AM
Out of the 2 million records, which ones are pertinent to the user?
What kind of searches are you doing? Wildcard, e.g. `like '%search term%'`?

Vishal Kaushik
Aug 7, 2019, 9:34:46 AM
It's a search based on parameters selected by the user; there are about 10 of them. Not every search returns 2 million records, but the worst-case scenario does. The RESTful API pulls records by offset, so it fetches only 10 records at a time depending on which page the user is on and whether they clicked next or previous. The problem is hitting the database for every 10 records.
I would like to fetch about 1000 records at a time and cache them locally somehow, so I can traverse those records without hitting the database. That would substantially reduce the number of database calls. As I am new to Angular, I am not aware of the features that could help me implement this idea.

Please suggest something if you can think of it.

Thanks

Tito
Aug 7, 2019, 2:49:41 PM
Answer this first:
1. Are you doing any wildcard searches?
2. What backend is it?

Your issue most probably is not Angular; it sounds like a backend issue.

Vishal Kaushik
Aug 8, 2019, 5:00:03 AM

There are no wildcard searches. The search is based on selecting values from over 12 dropdown lists, and SQL Server is the database. I don't really have an issue, except that I don't like the current approach of hitting the database for every 10 records. I want to hold 1000+ records in memory and show them spread over 100 pages, 10 records per page; if the user wants to see more, hit the database again for the next 1000 records.
The front end is Angular, which calls .NET REST APIs. These APIs connect to SQL Server via Entity Framework 6.

Tito
Aug 8, 2019, 11:44:48 AM
I would go with Redis caching.



Sander Elias
Aug 14, 2019, 4:39:05 AM
Hi Vishal,

The page size of your request and the page size in the UI don't need to be in sync. There is no issue with caching a couple of thousand rows in your frontend services.
It takes some planning and calculation. For example, if you keep 10 pages of data in the buffer, you might want to fetch the next batch when the user hits page 8 or 9.
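That threshold check can be sketched in a few lines of TypeScript (the function name and parameters are illustrative, not from any library):

```typescript
// Sketch of the prefetch rule above: with `bufferPages` pages buffered,
// start fetching the next batch once the user reaches one of the last
// `threshold` pages (e.g. page 8 or 9 of a 10-page buffer).

function shouldPrefetch(
  currentPage: number, // 0-based page index within the current buffer
  bufferPages: number, // number of pages held in the buffer, e.g. 10
  threshold = 2,       // begin fetching this many pages before the end
): boolean {
  return currentPage >= bufferPages - threshold;
}
```

The idea is that the next batch is already in flight (or cached) by the time the user pages past the end of the current buffer, so they never wait on the database mid-browse.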

Regards
Sander