NB_Store cart table grows

Editor
Nov 5, 2013 at 9:25 AM
I'm trying to solve a DB problem for a client. While pulling a list of all the table sizes in the database, I noticed this:

Image

The cart table contains over 700,000 rows and is over 120 MB in size.
I opened the table and sorted it by "DateCreated", and the oldest entries are from mid-August this year, so almost 3 months old.
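
For reference, the size of a single table can be checked in SSMS with the standard sp_spaceused procedure, e.g. (table name as it is in my DB):

EXEC sp_spaceused 'dbo.NB_Store_Cart';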

I went to the scheduler on the website and chose to run the cart cleanup assembly, which reported success, yet it didn't reduce the cart table size, and the 3-month-old entries are still in there.

Is there a way to clean it out manually, like simply deleting all the rows in there through SSMS?


Thx
Jake
Coordinator
Nov 5, 2013 at 9:36 AM
Hi Jake,

All the 'clear old carts' sproc is doing is this...
delete from dbo.[NB_Store_Cart] 
where DateCreated < dateadd(minute,(@CartMins * -1),getdate())
and PortalID = @PortalID

What version of the store are you using, and what do you have in the "cookiecart.expire" setting? If it's empty it should default to 30 days; maybe something is wrong with the default?...also check for any errors in the event log.
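
If you want to run that statement by hand in SSMS you'd have to declare the two parameters yourself first, something like this (the 43200 minutes, i.e. 30 days, and portal 0 are just example values):

declare @CartMins int = 43200  -- cart age in minutes before it gets purged (example value)
declare @PortalID int = 0      -- your portal ID (example value)
delete from dbo.[NB_Store_Cart]
where DateCreated < dateadd(minute,(@CartMins * -1),getdate())
and PortalID = @PortalID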

Dave.
Editor
Nov 6, 2013 at 6:03 AM
Edited Nov 6, 2013 at 6:35 AM
Store version is 2.3.6
DNN version 6.2.5
And cookiecart.expire isn't set (I've just set it to 30 now) ;-)

I just checked, and the cart table has grown by approx. 11,000 rows and 4 MB since yesterday.

I also checked the log viewer, and there are scheduler exceptions which say the transaction log is full.
I seem to recall I've had that issue before, and back then my googling led me to believe I can't clear the transaction log in any way...


Thx
Jake
Editor
Nov 12, 2013 at 8:32 AM
Edited Nov 15, 2013 at 8:27 AM
I investigated the matter a bit further and took a look at my own database.

I don't run an active shop on my site; I use it only for testing purposes, and I don't even link to the page from my site.
Yet the cart table has managed to grow to (hold on now) 781,327 rows. Well, that was the number before I started writing this post; as of right now it has already increased by 17 rows.

Knowing hardly anything about the scheduler, I don't know what I did wrong, but apparently I missed something, otherwise it would've cleaned out the cart table, right?

Btw, when I try to run the script you provided, I get an error saying "Must declare the scalar variable @CartMins".

Cheers
Jake

PS: Since I started writing this post, the cart table has grown by another 87 rows. :-D
Coordinator
Nov 12, 2013 at 9:28 AM
Hi Jake,

That's odd, there must be something triggering those cart records...maybe a robot?...but I'm surprised, they usually don't!

I've just checked two of my sites and one looks OK, quite a few carts but nothing worrying...however the other has a massive cart table!...so there is a problem somewhere.

I tried the clear cart scheduler like you, and again it didn't work!...I have an error in the event log about:

Arithmetic overflow error converting expression to data type nvarchar.

Do you have anything similar in your event log?

@Robax, Have you seen this on your installation???

That error from the SQL is because the script isn't supposed to be run without the variables being declared first...I ran this to clear mine.....

delete from dbo.[NB_Store_Cart]
where DateCreated < dateadd(minute,(15000 * -1),getdate())
and PortalID = 0

Just make sure you've got the right PortalID specified. I'll have a look to see if I can recreate this issue on my test system!
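
If you're not sure which PortalID the cart records belong to, a quick check is something like this:

select PortalID, count(*) as CartRows
from dbo.[NB_Store_Cart]
group by PortalID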

Dave.
Coordinator
Nov 12, 2013 at 9:52 AM
OK,

On my test system everything works as expected, only one cart per user, and nothing seems to be creating phantom carts.

On the online demo store I've noticed it has created over 100 carts today.....these are probably created by robots...which is an issue, but not a big one if the scheduler is running.

I tested the scheduler on the demo site and it works correctly, so on some sites we have a situation where the scheduler fails while robots keep creating cart records.
Coordinator
Nov 12, 2013 at 10:17 AM
Hi Jake,

What version of NB_Store are you using? Robax has reminded me of a fix we did; it was in v2.2.

However, the site I have an issue with is on v2.3.2, so I'm wondering if something else is stopping the scheduler from working.

The issue with robots creating cart records is not ideal, but we can live with that. What we do need to understand is what's happening with the scheduler.

Dave.
Coordinator
Nov 12, 2013 at 10:23 AM
Sorry, that fix was in v2.3.2, not v2.2 as I said above.
Editor
Nov 12, 2013 at 10:24 AM
Hi guys,

just to add to this...

A year or two ago there was a similar problem, and it was connected to the portal ID of '0' if I remember correctly.

Don't know if this helps you all out.

Geoff
Editor
Nov 12, 2013 at 11:36 AM
Edited Nov 12, 2013 at 11:53 AM
I decided to uninstall and do a fresh install on my own site and see what happens from there.

As for my client, his version is 2.3.6.
I tried running the script you posted, Dave, but I ran into that transaction-log-full issue, so the way to go would probably be to query with a SELECT instead of a DELETE, then manually select so many rows at a time and press the delete key, all through SSMS?

I figure that's also why the scheduler fails: deleting 680,000 records fills up the transaction log.

There are only records for portal 0, and the DNN site only has 'default' and '0' under the Portals folder.


Cheers
Jake
Coordinator
Nov 12, 2013 at 12:41 PM
OK Jake,

I've checked the error I was getting on my site and it's related to Googlebot requesting search pages that don't exist, something that was fixed in a later release than the one I'm running. So that doesn't have anything to do with it.

It's odd that my scheduler was saying the cart clean-up was OK, but checking the DB I didn't see the records removed. So either the number of records in there stopped the clean-up, or there is something else not quite right!...but it's odd that it works on my dev machine and on the demo site.

You can try putting your DB in simple recovery mode, so the transaction log gets truncated automatically at checkpoints, e.g......

ALTER DATABASE MyDB SET RECOVERY SIMPLE;

But if the DB is in full recovery mode, backing up the transaction log should free up the log space anyway. So you could try a log backup and then do the delete straight after.
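
For reference, checking the recovery model and taking a log backup can be done with something like this (the database name and backup path are just examples):

-- check which recovery model the database is in
SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'MyDB';

-- in FULL recovery, a log backup is what frees the log space for reuse
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn';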

I've cleared down the cart table on the site we have, and I'm going to monitor it to see if it starts to fill up again. The setting that controls the purge time is called "purgecartmins", not the one I mentioned above; that one is to do with the cookie expiry time.

Anyway, try to recreate the failure of the scheduler and get back to me with what happens. If you can recreate it, I'll produce a special debug DLL for the store so we can try to trace exactly what's happening.

Regards,
Dave.
Editor
Nov 12, 2013 at 1:12 PM
Well, I checked the DB and it is already set to simple mode.
It runs in a hosted environment, so I have no rights to start a backup myself.
However, the hosting provider says they run daily backups, so that should take care of it.

My thought though is that the transaction log isn't full the whole time, but that it fills up during the deletion of those 680,000 records, because deleting other things doesn't cause any problems.

I recall that some time ago I tried to clear the SearchItemWord table, which had gone off the scale, and I got the transaction log error. So I simply opened the table for editing and selected about 100,000 records at a time and deleted them; I remember that was about the most I could delete at once without getting the transaction log error.

Thx for the quick reply and for getting into this, Dave, I really appreciate it. :-)

Cheers
Jake
Editor
Nov 12, 2013 at 5:01 PM
OK, I guess I've fixed the problem for now.

I deleted the cart rows manually through SSMS using the script you provided, Dave. At first it failed, because it was trying to delete over 600,000 rows, which filled up the transaction log before it got through them all.

So I just raised the number "15000" to a bigger value to reduce the number of records matched, and then repeated the delete over and over with a gradually lower number until I got back down to 15000, deleting the excess rows in small chunks. :-)

After that I checked, and the table had been reduced from over 700,000 to just about 100,000 rows. I then logged on to the website and ran the cleanup assembly through the scheduler, and that reduced the table to 77,000 rows, basically 1/10 of its initial size.

I'll check the website over the next couple of days to see if the scheduler runs as planned. :-)


Cheers
Jake
Nov 13, 2013 at 10:18 AM
Hi, there is also a mistake in the scheduled task name; it should contain nb_store, not nbstore. Please see the end of this discussion: https://nbstore.codeplex.com/discussions/442389

Franco
Coordinator
Nov 13, 2013 at 11:50 AM
Ah!...yes, I forgot about that, thx for pointing it out, Franco.

@Jake, did you check this?
Editor
Nov 13, 2013 at 2:51 PM
Yeah, I saw Franco's comment earlier today, so I went in and checked, and it seems I've got the right one. :-)
Also, when I started it manually yesterday, it ran smoothly and reduced the cart table even further than I had done through SSMS.

I assume the scheduler will attempt to run again at the same time as I ran it last night, since it's set to run once every 24 hours, so I'll check the scheduler history later or tomorrow and see if everything looks good.

I'll make sure to post a follow-up comment here with a summary of the error I encountered and how it was solved, as that gives a short overview for others to use in the future if necessary. :-)


Cheers
Jake
Editor
Nov 15, 2013 at 8:59 AM
Edited Nov 15, 2013 at 9:04 AM
As promised, a summary after monitoring the site and its DB for a couple of days.


Issue:
The cart table was growing and the scheduler failed when trying to clean it up.
This resulted in the DB filling up with unused carts, causing the site to become unstable and eventually fail.


Here's what I did:

1. Create a scheduled task to clean old carts out of the cart table.

1.1
The scheduled task was set up to run once a day with a retry delay of 1 hour:
Image

1.2
With these settings:
Image


2. Manually remove the old carts through SSMS.

2.1
In SSMS I ran this SQL query:
DELETE FROM dbo.[NB_Store_Cart] 
WHERE DateCreated < dateadd(minute,(15000 * -1),getdate()) 
AND PortalID = 0 
As the cart table had more than 700,000 rows in it, the transaction log for the DB filled up while deleting the excess rows (a little more than 600,000 of them).
To get around that, I narrowed the "DateCreated" selection by raising the number 15000 to 150000, ran the script, lowered the number by 30000, ran the script again, and kept repeating that until I reached 15000 again.
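
An alternative to repeatedly adjusting the number would have been a batched delete, something like the sketch below (the batch size of 10,000 is just an example; keeping each delete small stops the transaction log from filling up):

-- delete old carts in small batches; loop until nothing is left to delete
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.[NB_Store_Cart]
    WHERE DateCreated < dateadd(minute,(15000 * -1),getdate())
    AND PortalID = 0;
    IF @@ROWCOUNT = 0 BREAK;
END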


Conclusion:
After manually removing the bulk of the unwanted records from the cart table, the transaction log no longer filled up, allowing the scheduler to do its job without failing.
It now runs as planned, preventing unwanted excessive growth of the cart table.



Thx for all the input in this thread. :-)
Coordinator
Nov 15, 2013 at 10:29 AM
OK, thx for the feedback, Jake. All we need now is a cast-iron method of stopping robots from creating carts :-)
Editor
Nov 15, 2013 at 11:14 AM
Edited Nov 15, 2013 at 11:24 PM
Dave,
A possible solution could be to check the UserAgent string against a list of known bots at the time the cart is created.

For example, if the UserAgent contains "Googlebot" then the cart is not created.

For example, the following are UserAgent strings for some of the bots...

Googlebot-News
Googlebot-Image
Googlebot-Video
Googlebot-Mobile
Mediapartners-Google
Mediapartners
AdsBot-Google
Googlebot
googlemediabot
bingbot
msnbot-media
msnbot
MSNPTC
gulliver
Yahoo
Slurp
FAST-WebCrawler
fast
vscooter
scooter
ia_archiver
Nutch
Yandex
Teoma
Lycos_Spider
ArchitextSpider
ZyBorg
SurveyBot
AlexaBot
Baiduspider
JikeSpider
aport
StackRambler

We have used our firewall to block a number of bots which are either too aggressive, ignore robots.txt restrictions, or simply add no value to our marketing. Blocking those bots will reduce the load on your server and further reduce the cart growth issue. We tend to let Google and Bing index our site and restrict the others via our firewall.


Craig
Editor
Nov 29, 2013 at 8:01 AM
Blocking bots using your firewall is only an option when you have control of the firewall; in a hosted environment it's not feasible.

However, afaik you can just create a file named "robots.txt" in your web root and have that file define the restrictions for bots.

I don't have much experience with it yet, but I tried it recently to prevent bots from scanning pages like "login.aspx", as I don't really need my clients' websites being indexed with words like "Password", "Login" etc. :-)
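
For example, a minimal robots.txt in the web root could look something like this (the paths are just examples, and only well-behaved bots will honour it):

User-agent: *
Disallow: /login.aspx
Disallow: /Store/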


Cheers
Jake
Editor
Nov 29, 2013 at 8:44 AM
DNN 7.1.2 has an IP Filter feature which may help block ranges of IP addresses too.

Craig