I've always been interested in developing a web search engine. What's a good place to start? I've heard of Lucene, but I'm not a big Java guy.
I want to create dynamic content based on this. I know the data is available somewhere, as web analytics engines can get it to determine how people got to
I'm building a Django project that needs search functionality, and until there's a django.contrib.search, I have to choose a search
What do search engine bots use as a starting point? Is it DNS lookup, or do they start with some fixed list of well-known sites? Any guesses or
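For what it's worth, crawlers generally do the latter: they begin from a curated seed list of known URLs (plus links discovered on earlier crawls) and expand outward, rather than enumerating DNS. A minimal breadth-first frontier sketch in Python; the injected `fetch`/`extract_links` callables and the overall shape are my own illustration, not any real crawler's API:

```python
from collections import deque

def crawl(seeds, fetch, extract_links, max_pages=100):
    """Breadth-first crawl sketch: pop a URL from the frontier, fetch it,
    and enqueue any links not seen before. `fetch` and `extract_links`
    are injected so the sketch stays self-contained."""
    frontier = deque(seeds)          # the seed list is the starting point
    seen = set(seeds)
    visited = []
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        page = fetch(url)
        visited.append(url)
        for link in extract_links(url, page):
            if link not in seen:     # avoid re-crawling known URLs
                seen.add(link)
                frontier.append(link)
    return visited
```

Real crawlers layer politeness delays, robots.txt checks, and URL canonicalization on top of a loop like this.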
Is there a way to search the latest version of every file in TFS for a specific string or regex? This is probably the only thing I miss from
I notice that Stack Overflow has a view count for each question, and that these view numbers are fairly low and accurate. I have a similar
I have a table named product, and there is a product in this table named "nike shoes". Now when I type, for example, "niksho" or "nsh", it should return to me
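A query like "nsh" matching "nike shoes" is ordered-subsequence matching rather than plain substring search. A minimal sketch in Python (the function names and in-memory product list are mine for illustration, not any particular library's API):

```python
def is_subsequence(query: str, target: str) -> bool:
    """True if the characters of `query` appear, in order, in `target`.
    Case-insensitive; spaces in the query are ignored."""
    remaining = iter(target.lower())
    # `ch in remaining` advances the iterator, so matches must occur in order.
    return all(ch in remaining for ch in query.lower().replace(" ", ""))

def search_products(products, query):
    """Return every product name that the query is an ordered subsequence of."""
    return [name for name in products if is_subsequence(query, name)]
```

For example, `search_products(["nike shoes", "adidas boots"], "niksho")` returns `["nike shoes"]`, and "nsh" matches as well. Against a real database you would more likely reach for `LIKE` patterns, trigram indexes, or a full-text engine; this only shows the matching idea.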
I'm using Rails with the Tire gem (for Elasticsearch), and I need to search across
I'm trying to create something that will have strong SEO results. I have two options for displaying the menus for the restaurant on the website:
Why doesn't Google Webmaster Tools see the static version of my site, but instead the template for the dynamic one?
I have added the spiderable package to my Meteor app, and the HTML
I'm reviewing a couple of my websites to make sure my SEO bases are covered. There are no private pages on the site in question, and we want all
We have developed a Vue app with support for different languages. For that, we use the dictionaries of i18n. Also, on
As I understand it, ReactJS renders on the client side, which means that when I fetch data from the API server, I need to wait before the title and meta tags change. So does
I have generated a sitemap from online generators; it seems to be working, and I even tested it on the old Google Search Console sitemap tester, and it
I've just started learning structured data, and I'm still trying to wrap my head around the concept. First I started out with Microdata
Sometimes, before launching new web projects, I put the site/app under a subdomain like new.domain.com or beta.domain.com. These URLs are only meant for my
I have about 100 pages on my website which I don't want to be indexed by Google. Is there any way to block them using robots.txt? It'd be very tiresome to edit each page
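If those ~100 pages live under a common path, robots.txt can disallow the whole prefix in one rule. A sketch with hypothetical paths (yours will differ):

```
User-agent: *
Disallow: /private/
Disallow: /archive/draft-
```

Keep in mind that robots.txt only blocks crawling; a blocked page can still show up in results if other sites link to it. A `noindex` robots meta tag or `X-Robots-Tag` response header is the reliable way to keep pages out of the index.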
I have a website which I recently started, and I also submitted my sitemap on Google Webmaster Tools. My site got indexed within a short time, but whenever I search for my
I am making a website using AngularJS, and I am curious to know whether there is any disadvantage to a hash in the URL with respect to SEO, e.g.
So if I google "stackoverflow", I get the following result (with a search bar):
I have a PHP site which contains thousands of pages. Every day I delete tens of pages which were already indexed by Google. When a visitor comes to one of those pages, I do
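One common approach for pages deleted this way is to answer `410 Gone`, which tells crawlers the removal is permanent and tends to get the URL dropped from the index faster than an ambiguous 404. The site in question is PHP, but the idea is language-agnostic; here is a minimal WSGI-style sketch in Python with a hypothetical set of deleted paths:

```python
# Hypothetical set of URL paths that have been deleted.
DELETED_PATHS = {"/old-article-123", "/old-article-456"}

def app(environ, start_response):
    """Minimal WSGI app: serve 410 Gone for deleted pages, 200 otherwise."""
    path = environ.get("PATH_INFO", "/")
    if path in DELETED_PATHS:
        # 410 signals a permanent removal, unlike the ambiguous 404.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

In PHP the same effect comes from sending the 410 status header at the top of the handler before rendering anything.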
So basically I have a site where occasionally I set up a "vanity" URL for web campaigns or product literature, i.e.