Difference between a UI Bug and a UX Bug

Lesson Learned from Software Testing:

As many of us know, UX stands for User Experience and UI stands for User Interface.

Let's look at an example:

On most websites, we see a search box.

Now, without entering any keyword in the search box, click the Search button.

No error or warning message appears. We feel that this is a UI bug.

But the application works as designed - the user is simply expected to re-enter a keyword and click the Search button. Hence, the issue may be rejected, as it is not a UI bug.

So a tester raises the issue as: there should be a warning message, "Enter a Search Keyword".
That is a UX bug.
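To make the expectation concrete, here is a minimal sketch of the kind of empty-search validation the tester is asking for. The function name and message are purely illustrative and not taken from any real application.

    def validate_search_query(query):
        """Return a warning message for an empty search, otherwise None."""
        if query is None or not query.strip():
            # UX expectation: tell the user why nothing happened
            return "Enter a Search Keyword"
        return None

    # An empty or whitespace-only query should produce the warning, not a silent no-op
    assert validate_search_query("   ") == "Enter a Search Keyword"
    assert validate_search_query("selenium") is None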


Why do websites have robots.txt?

A few days ago, a new concept landed on my desk - robots.txt - and I had not heard about it before. I asked many of my friends and colleagues about it.

Thanks to my guide, Thomas, who gave me a clear idea about robots.txt.
Here are my learning notes from it.
Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Web site owners use the /robots.txt text file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol.
This file is placed on the web server under the root folder and advises spiders and other robots which directories or files they should or should not access.
Example: www.yourwebsite.com/robots.txt
What does it do exactly?
When a web robot visits a site, the first thing it looks at is the robots.txt file. It reads the file to decide what it should do, based on the instructions mentioned in robots.txt.
Why have a robots.txt file?

1. Most people want robots to visit all the web pages and content on their website.

   How to make this work? Any one of the following options will do:
   1. Don't have a robots.txt file on the web server.
   2. Add an empty robots.txt file without any instructions.
   3. Add a robots.txt file with the following instructions:

      User-agent: *
      Disallow:

   Here, * means any web robot.
   Disallow tells the robots which folders they should not look at on the website; leaving it empty means nothing is blocked.

   A list of various web robots: www.robotstxt.org/db.html

2. A few people want robots to be kept out of some files on the web server while still being allowed to visit others:

   User-agent: *
   Disallow: /checkout
   Allow: /images

   The "Allow:" instruction tells the web robot that it is okay to look at the files in the images folder, while "Disallow: /checkout" keeps it out of the checkout folder.

How to test the robots.txt file?
1. We can test robots.txt with the Google Webmaster tool - robots.txt Tester.
2. To find out whether an individual page is blocked by robots.txt, you can use an online robots.txt checker, which will tell you if the page is blocked or not.
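We can also check the rules programmatically. Below is a minimal sketch using Python's standard urllib.robotparser module; the domain and paths are the illustrative ones from the example above, not a real site.

    from urllib.robotparser import RobotFileParser

    # The robots.txt rules from the earlier example (illustrative, not a real site)
    rules = [
        "User-agent: *",
        "Disallow: /checkout",
        "Allow: /images",
    ]

    robots = RobotFileParser()
    robots.parse(rules)  # parse the rules directly instead of fetching them over the network

    # can_fetch() answers: may this user agent crawl this URL?
    print(robots.can_fetch("*", "http://www.yourwebsite.com/checkout"))         # False
    print(robots.can_fetch("*", "http://www.yourwebsite.com/images/logo.png"))  # True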


Problem/Solution: Tab order not working in Safari on Mac OS

I came across a situation while testing on the Safari browser on Mac OS: pressing the Tab key did not move the focus through all the fields in the form.

Initially, I thought it was a bug in the application. But since I was new to testing on the Mac OS platform, I wondered whether there were any settings that needed to be looked at first.



I found that Tab navigation is disabled by default in Safari on Mac OS, so pressing Tab on the keyboard does not move through each element on the page.

Solution:
But there is an option to turn it on: in Safari, go to Preferences -> Advanced and enable the option that lets Tab highlight each item on a webpage, which enables basic tab navigation.



Hopefully customers and users know about this before they raise it as an issue.


PS: Does anyone know the reason for having tab navigation turned off in Safari by default?
