
Fix Google Crawl 404 Errors With Laravel

I recently replaced an old website with a new one built with Laravel. Now when I check Google Webmaster Tools I see a number of 404 errors, caused of course by the changed URIs. Yesterday I fixed them with simple redirects like

    Route::get('librerie_su_misura/librerie_su_misura.php', function() {
        return Redirect::to('librerie-su-misura', 301);
    });
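
There are several of these old URLs, so the same thing could be written as a small map instead of one closure per route (the second old/new pair below is only an example, not a real path from my site):

    // routes.php – every legacy .php URI 301-redirects to its new slug.
    // Only the first pair is real; the second is just an illustration.
    $legacy = array(
        'librerie_su_misura/librerie_su_misura.php' => 'librerie-su-misura',
        'cabine_armadio/cabine_armadio.php'         => 'cabine-armadio',
    );

    foreach ($legacy as $old => $new) {
        Route::get($old, function() use ($new) {
            return Redirect::to($new, 301);
        });
    }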

But this morning when I woke up I started wondering whether that is really fine for Google, or whether it would be a better approach to keep serving the pages at the old paths as well, like

    Route::get('librerie_su_misura/librerie_su_misura.php', 'SomeController@someMethod');

What is the best approach in your opinion? Of course I would like to delete the old routes some day, so do you think the first approach is OK for Google?


Answer

According to RFC 2616, section 10.3.2 (301 Moved Permanently):

The requested resource has been assigned a new permanent URI and any future references to this resource SHOULD use one of the returned URIs. Clients with link editing capabilities ought to automatically re-link references to the Request-URI to one or more of the new references returned by the server, where possible.

Using a 301 redirect is also the practice Google recommends when a page has moved permanently.
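
As a minimal sketch of that setup, assuming Laravel 5.6 or newer where `Route::permanentRedirect` is available (on older versions the closure with `Redirect::to($uri, 301)` from the question does the same thing):

    // routes/web.php – answer the old .php URI with a 301 pointing at the
    // new slug; crawlers will replace the old URL with the new one over time.
    Route::permanentRedirect('librerie_su_misura/librerie_su_misura.php', '/librerie-su-misura');

Once the old URLs have dropped out of the crawl error reports, the redirect routes can eventually be removed.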
