Robot Questions
How to change the size of objects in Webots
I am new to Webots. I have to build a simulation environment. I can add a ping pong ball but can't change its size. When I want to change the size
How to remove the elements present in one list from another list using Robot Framework
I am very new to Robot Framework and trying to print the values which are not common: $list1= ['test1','test2','test3','test4'] $list2=
Deny access but allow robots, e.g. Google, to sitemap.xml
Is there a method where you can allow only robots such as Google, Yahoo, or other search engine robots to access my sitemap, which is located at
Robots.txt and SiteMap.xml within MVC 2 app
So I have an MVC 2 website online right now. I set up my webmaster account over at Google, and they want me to submit the robots.txt and
Static files in Flask - robots.txt, sitemap.xml (mod_wsgi)
Is there any clever solution to store static files in Flask's application root directory? robots.txt and sitemap.xml are expected to be found in
Robot Framework connectivity with database
Connect to database psycopg2 connectiondetails ${count}= row count select count(*) from trial; log to console
How to design a realtime deep learning application for robotics using Python?
I have created machine learning software that detects objects (duh!), processes the objects based on some computer vision parameters, and then
Importing custom testing library in Robot Framework
I am writing a custom testing library for Robot Framework in Python and I want to import it like this:
Can I verify a list of XPaths at once?
I'm new to Robot Framework, so I created a test case to verify that a bunch of elements exist on the target web page, so my question is
How to run all the robot files inside the current directory
I have robot files inside a directory and I need to run all the robot files inside it in sorted order. Below is my code. Is it the right code or
How to read CSV values using Robot Framework
How to read a particular value based on a key using Robot Framework? I am trying to read a particular value from a CSV by passing the key
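As a rough sketch of one common approach (the file name, column names, and function name below are hypothetical), the lookup can live in a small Python file that Robot Framework imports as a library, so the function becomes a keyword:

    import csv

    def get_value_by_key(csv_path, key_column, key, value_column):
        """Return the value in value_column for the row whose key_column equals key."""
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                if row[key_column] == key:
                    return row[value_column]
        raise KeyError(f"{key!r} not found in {csv_path}")

In a suite this module could then be pulled in with a Library import and called like any other keyword.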
I need to fetch a value from a YAML file using Robot Framework
I am trying to substitute a value inside my YAML file but am unable to write a proper string using Robot Framework. "patch_id_bw_data":
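Assuming PyYAML is available (an assumption, since the excerpt does not say which parser is used), a minimal sketch of reading one value that a custom keyword could return to the test; the file name and key are illustrative:

    import yaml  # PyYAML, assumed to be installed

    def get_yaml_value(path, key):
        """Load the YAML file and return the value stored under key."""
        with open(path) as f:
            data = yaml.safe_load(f)
        return data[key]

    # e.g. get_yaml_value("config.yaml", "patch_id_bw_data")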
Animation only works when moving the mouse
I am using Pygame and I have to adapt the following code so that the animation also shows when the mouse is not moved (so there is no event).
How do I combine two SeleniumLibrary keywords that use the same variable into a custom keyword and call it in other keywords?
I am writing some basic automated tests and I find myself repeating the same two keywords over and over again. I wondered if there was a simpler
Why does the pipeline fail when there's a failed test case?
I'm running Robot Framework in GitLab CI. The problem I'm facing is that if any test case fails on the run, it will fail the
Script raising an exception as per Python's logic, but the test case still passes in Robot Framework
I am a newbie to Robot Framework and having an issue with my script execution. My case is failing correctly as per Python's logic; however
How to add text to TextView from Loader
Now I'm studying threads and my task is to make a counter which will add numbers from 0 to 9 to a TextView with the help
How to make Robot Framework's Execute Javascript keyword return a value
How to make Execute Javascript of Robot Framework return a value? I need the text of a class element to be returned using
Robot Framework custom keyword only in Test Setup
Is there a possibility in Robot Framework to allow executing a custom keyword only in the test setup part (or alternatively in test
"Lighthouse was unable to download a robots.txt file" despite the file being accessible
I have a Node.js/Next.js app running at http://www.schandillia.com. The
How can an AppCompatActivity communicate with FragmentActivity using the EventBus?
Question: how can an AppCompatActivity communicate with a FragmentActivity using the EventBus? Findings:
No subscribers registered for event class between fragments in EventBus
I'm trying to use the greenrobot EventBus between two fragments, but I still get "no subscribers registered for event class". In my case I have two
How to dynamically apply variables to the values of a dictionary using Python and/or RobotFramework
Say I have a list of dictionaries: url_list = [ {'google': 'http://www.google.com/join'}, {'yahoo':
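One plain-Python way to do this, assuming the URL values carry {placeholder} markers to be filled at runtime (the placeholder scheme and values below are made up for illustration):

    url_list = [
        {"google": "http://www.google.com/{path}"},
        {"yahoo": "http://www.yahoo.com/{path}"},
    ]
    variables = {"path": "join"}

    # Substitute the variables into every value of every dictionary.
    resolved = [
        {name: url.format(**variables) for name, url in entry.items()}
        for entry in url_list
    ]
    # -> [{'google': 'http://www.google.com/join'}, {'yahoo': 'http://www.yahoo.com/join'}]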
Been looking at a lot of third-party library code lately and see this code, which is confusing me
So here is the piece of code from EventBus's getDefault() static method, which returns a static instance of the EventBus class. /**
How to use EventBus between activities in Android
In my application I have two activities, Activity A and Activity B. Into
Is there a way to call Variables file multiple times with different arguments and have it override the previous call?
I want to create a Robot keywords file for deploying different pods to Kubernetes. The configuration is chosen based on a parameter passed when
How to pass default mutable arguments in Robot Framework?
I need to pass default mutable arguments in Robot Framework. In Python we generally initialize the argument to
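For reference, the usual plain-Python idiom is a None sentinel, and a library keyword written this way behaves the same when called from Robot Framework; the names below are illustrative:

    def add_item(item, bucket=None):
        """Append item to bucket, creating a fresh list when none is supplied."""
        if bucket is None:
            bucket = []  # a new list per call, avoiding the shared-default pitfall
        bucket.append(item)
        return bucket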
How to assign SQL query results to a variable (Robot Framework) to input into an application edit text?
Using a MySQL and Robot Framework combination along with the Robot DatabaseLibrary and the PyMySQL library. Connecting to the database: connect to
Is Robotium still under development?
Does someone know if the great framework is still under development? The last commit is from 2016 and it seems not to be compatible with the latest Android
Angular, Electron, TypeScript and robotjs
I'm trying to use TypeScript together with Electron and robotjs. I'm a beginner with all of these technologies, so I lack a deep understanding of
Robots.txt cannot be located
I'm currently running a website (http://www.agents-world.com/) served by a
How to pass a dictionary as an argument to a method that will take all the users dynamically and do the desired operations?
I am stuck on one problem in which I have to pass a dictionary (or some other data storage type) as an argument to a method that will take all
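A minimal plain-Python sketch of the idea (the user data and the operation performed are invented for illustration): pass the whole dictionary in and iterate over it inside the method:

    def process_users(users):
        """Iterate over a dict of user name -> attributes and act on each entry."""
        for name, attrs in users.items():
            print(f"Processing {name} with role {attrs.get('role', 'unknown')}")

    process_users({
        "alice": {"role": "admin"},
        "bob": {"role": "viewer"},
    })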
After applying a FOR loop in Robot Framework, an error came as follows: Keyword name cannot be empty
I am applying FOR loops in Robot Framework in which I created a list of two methods. The loop should traverse through the methods and
How to generate a random float value in Robot Framework
Actually I want a number which is a random float value. For an integer value I used ${impressions_int}= Evaluate
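Robot Framework's Evaluate keyword runs a Python expression, so the float counterpart of the integer approach is random.uniform; in plain Python (the bounds are illustrative):

    import random

    impressions = random.uniform(1.0, 100.0)  # random float in [1.0, 100.0]
    print(round(impressions, 2))

In a test the same expression would typically sit inside an Evaluate call.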
How to use EventBus between two Activities on Android
In my application I want to use EventBus to manage events. In my app I open Activity B on the Activity
Does RequestsLibrary support executing suites in parallel with Pabot?
My project uses RequestsLibrary to do
Trying to impact RecyclerView items via EventBus
I'm trying to post an EventBus event from a recycler item view class and subscribe to it in the same class so that the event is grabbed by all
Robot Framework - How to start with pabot.PabotLib Library?
The Pabot documentation provides an example to illustrate how to call it, like: ***settings*** library pabot.pabotlib *** test
Robot Framework - Appium Library: Security Exception while starting the application
I want to test an Android application and I have to use Robot Framework and the Appium library. Since I am new to Robot Framework, I have created a
Connect to virtual Robot via QiMessaging JavaScript
I would like to test my JavaScript application locally. Therefore I would like to establish a connection between my JavaScript application
Cannot access robots.txt file in ReactJS project
I am working on a MERN stack project using Keystone. I am trying to access the robots.txt file but it is giving me the error failed
POST request failed to capture the access token in Robot Framework
** settings *** library requestslibrary library collections *** variables *** ${service_root} https://xxx.yyy.org/m1/oauth
How can I serve robots.txt on an SPA using React with Firebase hosting?
I have an SPA built using create-react-app and wish to have a robots.txt like this: http://example.com/robots.txt
EventBus onMessageEvent not getting called
I have implemented EventBus in my project but I am not getting all of my events. public class MainActivity extends
Is there a way to import Robot Framework resource file contents into a Python script?
Obviously it is possible to import a Python Robot Framework library in some Python script. However, is there a magic way to import Robot
Robot Framework keyword creation mapped to Python class structure
I have just started to look at adding Robot Framework on top of our current pytest-based system. In our tests we are doing things like:
GreenRobot EventBus Gradle upgrade failed with de.greenrobot:eventbus:3.1.1
When I tried to upgrade the EventBus SDK with the dependency "de.greenrobot:eventbus:3.1.1", it says it couldn't resolve
Double click event using Robot class in awt package
I have already seen a lot of threads regarding double click events using MouseEvent, but that is not what I am looking for. I recently
Guava's EventBus - visibility of @Subscribe
Annotations from interface methods are not inherited by objects implementing that interface, AFAIK.
No keyword with name 'Select Window' found
*** settings *** library selenium2library *** variable *** ${handles} *** test case *** testing open browser
Multisite TYPO3 v9, distinct robots.txt for multiple domains on one rootpage
For marketing purposes I maintain one identical website with two different domains. In TYPO3 v8 I would simply add a domain record on the root
java.lang.RuntimeException: It looks like you are using EventBus on Android, make sure to add the "eventbus" Android library to your dependencies
I am a fan of this EventBus library and have used it on other projects without getting any issues. But now, I am getting some odd issue with
Conditional IF with a function in the condition in Robot Framework with Selenium
I would like to know if someone has found a way to call a function in the condition of the IF. My goal is to do something like
Trying to block Google scripts with Apache
Someone is making requests to my server with Google scripts and I don't know how to block it. This is a part of my Apache access.log:
How to click on an element in a list in robot framework
How can I select Settings from this dropdown list? I tried Click Element by (class/id..) but none of them work for me. First I have to click
How to find upper left most corner of my Contour Bounding Box in Python OpenCV
What I'm doing: I have a robotic arm and I want to find x,y coordinates for objects on a piece of paper. I am able to
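For reference, OpenCV's boundingRect already returns the upper-left corner of a contour's bounding box; a minimal sketch, where the input image and the thresholding step are assumptions:

    import cv2

    img = cv2.imread("paper.png")  # hypothetical input image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, thresh = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x returns (contours, hierarchy)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    for c in contours:
        x, y, w, h = cv2.boundingRect(c)  # (x, y) is the upper-left corner
        print("upper-left corner:", (x, y))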
RF: Convert a List of Tuples to a List of Lists
I have fetched data from the DB and got this tuple. I am using the below code to convert the tuple into a list, but the output is coming index-wise, like
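If the database rows come back as a list of tuples, the whole conversion is a short comprehension in Python (the sample rows are made up):

    rows = [(1, "alice"), (2, "bob")]  # e.g. the result of a DB fetch
    as_lists = [list(row) for row in rows]
    # -> [[1, 'alice'], [2, 'bob']]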
Screen share Pepper's tablet
How can I share the screen of Pepper's tablet? I wish to show the content displayed on the tablet on a projector so that it engages a larger
Running a Python function as a keyword in Robot causes an infinite loop
I have a Python function: def send_batch_email(recipient, batch=3, limit=50): for i in range(0, batch): for j in range(0,
robots.txt file being overridden / injected from external source?
We have a couple of WordPress sites with this same issue. They appear to have a "robots.txt" file with the following contents:
Deserialization and serialization of navigation maps
I made an exploration with my Pepper and then I got the metrical representation of the map with
Why does Google index this?
Possible duplicate: Why Google
Why the Google robots.txt Tester reports errors and says it's not valid
As you can see in the image below, the Google Webmaster Tools robots.txt tester tells me about 9 errors but I don't know how to fix them and
Multiple Sitemap: entries in robots.txt?
I have been searching around using Google but I can't find an answer to this question. A robots.txt file can contain the following
SEO chaos from changing robots.txt file in Wordpress site
I recently edited the robots.txt file on my site using a WordPress plugin. However, since I did this, Google seems to have removed my site from
Asterisk in robots.txt
Wondering if the following will work for Google in robots.txt: disallow: /*.action. I need to exclude all URLs ending with .action.
robots.txt: ignore all folders but crawl all files in root
Should I then do user-agent: * disallow: /, and is it as simple as that? Or will that not crawl the files in the root
How can a robots.txt ignore anything with action=history in it?
I have a MediaWiki, and I don't think I want Google indexing the history of any page. How can a robots.txt disallow URLs with
Can I use robots.txt while handling my site with .htaccess?
I am using .htaccess on my site, such that all requests to my site will be redirected to the index page in my root directory. No other file in my
robots.txt configuration
I have a few doubts about this robots file. user-agent: * disallow: /administrator/ disallow: /css/ disallow: /func/ disallow:
How do I modify robots.txt in Plone?
I've got a Plone site that I administer and I'd like to add some pages to the Disallow of a robots.txt. It appears that Plone
How to disallow search pages from robots.txt
I need to disallow http://example.com/startup?page=2 search pages from
robots.txt and a wildcard at the end of Disallow
I need to disallow indexing of 2 pages, one of them dynamic: site.com/news.php site.com/news.php?id=__ site.com/news-all.php
How to disallow all dynamic URLs in robots.txt
How to disallow all dynamic urls in robots.txt disallow: /?q=admin/ disallow: /?q=aggregator/ disallow: /?q=comment/reply/
How do I configure nginx to redirect to a URL for robots.txt & sitemap.xml?
I am running nginx 0.6.32 as a proxy front-end for CouchDB. I have my robots.txt in the database, reachable as
Ethics of robots.txt
I have a serious question. Is it ever ethical to ignore the presence of a robots.txt file on a website? These are some of the considerations I've
Robots.txt block access to all https:// pages
What would the syntax be to block all bots from accessing https:// pages? I have an old site that now doesn't have SSL and I want to block
robots.txt: Disallow bots to access a given "url depth"
I have links with this structure:
Robots.txt: allow only major SE
Is there a way to configure the robots.txt so that the site accepts visits only from Google, Yahoo!, and MSN spiders?
Anybody got any C# code to parse robots.txt and evaluate URLs against it?
Short question: has anybody got any C# code to parse robots.txt and then evaluate URLs against it to see if they would be excluded or
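Not C#, but Python's standard library shows the same parse-then-evaluate flow the question asks about, and a C# port would follow the same shape (the URLs and user agent are made up):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the file

    # True if "MyBot" may fetch the URL, False if robots.txt excludes it
    print(rp.can_fetch("MyBot", "https://example.com/private/page"))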
How to prevent robots.txt passing from staging env to production?
It has happened in the past that one of our IT specialists moved the robots.txt from staging to production accidentally, blocking Google and
robots.txt: disallow all but a select few, why not?
I've been thinking a while about disallowing every crawler except Ask, Google, Microsoft, and Yahoo! from my site. The reasoning behind
Googlebot not respecting Robots.txt
For some reason when I check Google Webmaster Tools' "Analyze robots.txt" to see which URLs are blocked by our robots.txt file, it's not what
Robots.txt to disallow everything and allow only specific parts of the site/pages. Is "allow" supported by crawlers like Ultraseek and FAST?
Just wanted to know if it is possible to disallow the whole site for crawlers and allow only specific web pages or sections. Is "Allow" supported
Googlebots Ignoring robots.txt?
I have a site with the following robots.txt in the root: user-agent: * disabled: / user-agent: googlebot disabled: /
Robots.txt with multiple domain sitemap entries
Our website has many domain names like: example.co.uk example.in example.co.eg ... so in robots.txt
How to block a certain type of urls on robots.txt or .htaccess?
Currently on my webshop, on the category pages with too many pages, the URLs end with
Lighthouse false flag
I put my website through the Lighthouse test via web.dev and there I am shown 2 "errors" that I cannot quite comprehend.
Can you check multiple URLs in Robot Framework simultaneously?
I am editing my original question; hopefully this shows that I've done a bit more research. I think I figured out one way to do
Googlebot not recognizing dynamic robots.txt
I have created a dynamic route with Laravel that serves a txt response. It works in the browser, but Googlebot says that there is no
I have disallowed everything for 10 days
Due to an update error, I put into prod a robots.txt file that was intended for a test server. As a result, prod ended up with this
PrestaShop robots.txt and /module/ path indexing
Friends, I have a question regarding the PrestaShop off-the-shelf robots.txt file. Is it normal practice to allow search engines to index
robots.txt: block all except Lighthouse
I have a staging site that I want to be able to test via Google Lighthouse, but I do not want Google to index it. When I use this:
Robots.txt, php.ini, connect_to_database.php, .htaccess
I cannot seem to find an answer anywhere as to whether or not I should disallow configuration files like /php.ini or hidden files
How to tell search engines to use my updated robots.txt file?
Before, I had blocked search engine robots from crawling my website using the robots.txt file, but now I want to unblock them.
How does Google know the links to my web pages? I want to create a multi-language site but SEO stands in my way
I'm bad at English, forgive me and try to understand me. I am trying multi-language with a database (mysqli) but I'm having
Not understanding this robots.txt
Another company has set up the robots.txt for a site I manage. This is the code they used: user-agent: googlebot user-agent:
What's the meaning of the slash at the end of a Disallow path?
Here are two lines in a robots.txt file: disallow: /messages disallow: /qanda/edit/ What paths is
robots.txt and disallowing an absolute path URL
I am using Heroku pipelines. So when I push my application it is pushed to the staging app https://appname.herokuapp.com/
Allowing an external JavaScript file to be crawled
I am facing an issue with my site in Google Search Console. I am getting the below error in Google Console for my site resource: