high CPU load

Started by EIF, April 17, 2013, 08:42:51 AM


EIF

My website runs on a shared host and has some performance issues.
I am able to view the CPU load, and it is constantly between 25% and 50%. That is far too high, considering there are not many visitors.
Sometimes the CPU load spikes to 100%, at which point my hosting provider automatically throttles my CPU usage so the other websites on the server can still operate.

What is the best way to find out what (a plugin?) is causing this CPU load?

franzpeter

Do you use a component or a plugin that does automatic SEO optimization, meaning one that can highlight keywords and so on? Another thing to check is dynamic sitemap creators. Xmap, for example, can push any host to the edge of its CPU limit, and JCrawler sometimes even produces 500 errors. If you give crawlers the link to a dynamically created sitemap, every time they call that script they produce high CPU load.
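
If you want to test whether a dynamic sitemap is the cause, one temporary stopgap (a sketch only) is to keep crawlers away from the sitemap URL in robots.txt. The pattern below assumes Xmap's default com_xmap URLs, and the * wildcard is only honoured by the major crawlers such as Google and Bing:

# temporarily keep crawlers off the dynamically generated Xmap pages
# (assumes default com_xmap URLs; * wildcard works for Google/Bing, not every bot)
User-agent: *
Disallow: /*option=com_xmap

Of course this also stops crawlers from reading the sitemap at all, so it only makes sense while testing whether Xmap is the culprit.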

EIF

No, I don't use SEO optimization plugins at the moment, only the standard VM SEF. Nor am I using JCrawler. But I DO use Xmap!
I'll disable Xmap to see if the CPU load changes. Do you have any recommendations for Xmap alternatives?



franzpeter

The problem with Xmap is that you cannot control when crawlers request the Xmap URL. If you have a lot of categories and products and a crawler hits the sitemap URL, the sitemap is generated on the fly and your server becomes very slow or even unresponsive. There are two ways to mitigate that.
I am not an SEO specialist; maybe someone who knows more can give a better answer to this. So, as a non-specialist, I ask myself: is it necessary to put all the categories and products into the sitemap, or is it better to put just the categories into the sitemap to help crawlers find pages? I think that if only the categories show up in the sitemap, a crawler will follow the links inside the category pages and should find the products too, but I do not know that for certain. Using just the categories should not produce a high CPU load.
The other way is to use a static sitemap, which you have to create first. The file sits in your website directory, so a crawler can download it without calling a page that dynamically generates anything. You do need to take care to update that file regularly.
User PRO showed a way to use CSVI for that: http://forum.virtuemart.net/index.php?topic=104612.0
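
Another option (a sketch only): let a cron job fetch the dynamic sitemap once a night and save it as a static file, so Xmap runs once per day instead of on every crawler visit. The Xmap URL parameters and the paths below are assumptions you would have to adapt to your own site:

# crontab entry: regenerate the static sitemap.xml at 03:00, when traffic is low
# (Xmap URL and file path are assumptions, adapt to your site)
0 3 * * * wget -q -O /home/user/public_html/sitemap.xml "http://www.example.com/index.php?option=com_xmap&view=xml&id=1"

Crawlers then download /sitemap.xml directly without triggering Xmap.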

PRO

EIF, post your URL and we will take a look.
Normally it's the template.

EIF

The url is: http://www.simplicitaz.nl

Yes, I know I have some tuning to do, like deferring JavaScript parsing (which I have some issues with, but that will be a new topic ;-) ), setting up browser caching, and so on.
But I assume this tuning does not affect the constant CPU load, only the page load times when visiting the website.
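
For reference, a sketch only (the file name is hypothetical): deferring just means adding the defer attribute to external scripts, so they download in parallel but run only after the HTML is parsed. Sliders often break with this because their init code runs before the deferred library has loaded:

<!-- hypothetical script: with defer it downloads in parallel and executes after parsing -->
<script src="/templates/mytemplate/js/slider.js" defer></script>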

franzpeter

@PRO,
yes, it would be helpful to see the site. But from what I can tell, dynamic sitemap creators on large sites produce exactly these symptoms when search crawlers call them. Normally your server is fast enough, no problem. But if crawlers request the sitemap URL and, as with Xmap, the sitemap view gets generated dynamically, your server goes down. You cannot control when crawlers request the sitemap, and sometimes two or three crawlers hit it at nearly the same time; the result is that your server becomes unresponsive. If JCrawler worked better together with VM (a JCrawler issue), it would be a much better choice than Xmap: JCrawler can save the sitemap as a file, which Xmap cannot. Xmap starts from scratch every time it is called, which is disastrous for large sites, commerce sites and so on.
I have had that experience with Xmap myself. By looking into the server access log I could find out what triggered the problem: it happened whenever crawlers called Xmap. The result was exactly what EIF describes; sometimes the server was unresponsive.
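
For anyone who wants to check their own access log, a one-liner along these lines shows which user agents request the sitemap most often (the log path and the com_xmap pattern are assumptions, adjust to your server):

# count sitemap requests per user agent (Apache combined log format assumed)
grep "option=com_xmap" /var/log/apache2/access.log | awk -F'"' '{print $6}' | sort | uniq -c | sort -rn | head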

But can you give me an answer to my question: is it necessary to have deep page links (like products) inside a sitemap, or is it sufficient to provide just the category links to guide crawlers to the places where the deeper links, such as products, can be found?

franzpeter

EIF,

I took a look at your page: very nice! I also checked it with PageSpeed, and it is not very slow. What you could still do to make it faster is scale the images to reduce the page weight. The slider images are much bigger than necessary: the slider displays them at 205 x 175 pixels, but they are only scaled down to that size with HTML or CSS, so on every page load the browser has to download the full-size images and scale them down. Still, I would say the page is not slow; I think Xmap is the problem whenever it gets called.
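
As a side note, that scaling could be done once on the server, for example with ImageMagick. A sketch, assuming JPEG slider images in a hypothetical images/slider folder (back up the originals first):

# resize the slider source images in place to their 205x175 display size
mogrify -resize 205x175 -strip -quality 85 images/slider/*.jpg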
Maybe someone else can answer the question of whether it is necessary to provide deep links to the products, or whether sitemapping just the categories is enough to help crawlers find the deep links.

EIF

Thanks for looking!
Yes, I ran PageSpeed myself and saw that the slider is using images that are too large. I will change that. Maybe I will replace the whole slider, because I want to combine all my JS files into one with JCH Optimize, but whatever I do, it breaks the slider. But OK, I am working on that.

I don't know if it is necessary to have deep links to all the products. I assume it is better for SEO, but if that is what is causing the CPU load, then I will have to look for alternatives.

PRO


htaccess code to add an Expires header:

# requires Apache mod_headers; a fixed far-future date must be updated by hand once it passes
<IfModule mod_headers.c>
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Expires "Thu, 15 Apr 2014 20:00:00 GMT"
</FilesMatch>
</IfModule>
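
If the host has mod_expires enabled (an assumption about the server), a relative expiry avoids having to bump the fixed date by hand:

# requires Apache mod_expires; expiry is counted from each request
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>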