Tag: Web search engine

  • Dangers Of Google Search Health Conditions

    Recently I came across two instances where Google search became dangerous.

     

    In one instance, a young woman settled in the US was eating dumplings at a restaurant with her colleagues.

     

    The dish was quite hot.

     

    TMJ condition

     

    While engaged in conversation with her friends, she did not notice how hot it was and popped it into her mouth.

     

    It was so hot that, as a reflex, she swallowed it instead of spitting it out.

     

    The piece got stuck in her throat; she could not speak or swallow, her face turned bluish, and she could not breathe.

     

    People around her did not notice, but someone nearby did, gave her a slap on the back, and she swallowed it.

     

    Otherwise she might have choked to death.

     

    Air Marshal Mukherjee of the Indian Air Force died this way aboard an aircraft.

     

    But this post is not about the dangers of eating without basic precautions while engaged in conversation.

     

    (Hinduism forbids speech/conversation while eating.)

     

    The girl, an educated web addict, searched Google for her condition and symptoms.

     

    She found a lot of cases where the writers described being unable to touch food for months because of fear.

     

    The girl got scared and barely ate for a week, until someone told her firmly to eat, and she resumed eating normally.

     

    In another instance, a girl searched Google for pain in the cheek and jaw and landed on people writing about symptoms of TMJ, where someone had written that she had had the condition for twenty years, was in incurable pain, and had been on steroid medication, which complicated matters by adding insomnia to the list.

     

    And this girl started doing the rounds of the doctors, from a dentist to an ENT specialist, a neurologist and a neurosurgeon, and took an FBC (Full Blood Count) test, urine culture, sputum and stool tests.

     

    All the parameters were normal.

     

    The girl started imagining she had some disease that would stay with her for life and that she would lose her sanity!

     

    For two days she hardly slept, not more than five hours a day.

     

    She now imagines she has insomnia and wanted to meet a psychiatric counselor.

     

    Having studied psychology and knowing what this was about, I forbade her.

     

    I scolded her very rudely, counseled her, and she is all right now.

     

    Lessons.

     

    1. Medicine is an evolving science, not an exact one.

     

    Doctors proceed from symptoms to disease, not the other way around.

     

    One symptom may be indicative of many diseases, some of them serious, some worth ignoring.

     

    By trial and error, experience, or both, doctors diagnose.

     

    It is foolish to replace a doctor with an Internet search.

     

    Still worse is reading only from among the first twenty or so results of a Google search.

     

    I find people normally taking the first two or three Google results as gospel.

     

    Nothing can be more wrong.

     

    Authentic information need not appear in the first results.

     

    The first-page search results are determined not so much by the correctness of the information as by keywords, the search term used, and the number of times the article has been read.

     

    If the initial query is incorrect, or if a site has used the right keywords with wrong information, that site will still appear in the first twenty results.
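    To make the point concrete, here is a toy ranking model (my own illustration, not Google's actual algorithm; the page titles, keywords and read counts are invented): pages are ordered by keyword overlap with the query and then by read count, and the correctness of the content never enters the score.

    ```python
    # Toy search-ranking sketch: rank pages by keyword overlap and popularity.
    # Note that the "accurate" flag plays no part in the ordering.
    pages = [
        {"title": "TMJ home remedies",  "keywords": {"tmj", "jaw", "pain"}, "reads": 90000, "accurate": False},
        {"title": "TMJ clinical guide", "keywords": {"tmj", "disorder"},    "reads": 1200,  "accurate": True},
    ]

    query = {"tmj", "jaw", "pain"}

    # Sort by (keyword matches, read count), best first.
    ranked = sorted(pages, key=lambda p: (len(query & p["keywords"]), p["reads"]), reverse=True)
    print([p["title"] for p in ranked])  # the popular, inaccurate page wins
    ```

    The inaccurate but keyword-rich, widely read page ranks first, which is exactly why the top results need not be correct.
    
    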

     

    Therefore the first results in a Google search need not be correct.

     

    Again, unless you know the fundamentals of medicine, do not rely on the Internet, whatever the site claims to be.

     

    And remember, our body is the best system in the world at healing itself.

     

    Do not brood over illnesses; consult a doctor, have faith in him, and follow his advice.

     

    Take time to find a general physician who does not prescribe medicine at the drop of a hat, or send you for a lot of tests for every small thing.

     

    And stick with him.

     

    Such a doctor need not be expensive.

     

    I go to a doctor who is at least 50 years old, who examines me physically, allows my body to heal without resorting to heavy dosages, and who is calm and not in a hurry.


  • Google Reads All Emails, Justifies It

    Google has admitted to reading all email contents, in what it terms an effort “to provide you personally relevant product features, such as customised search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored.”

     

    Google reads emails as part of its service
    Google says the change in its policy allowing the company to trawl through emails will give “people even greater clarity”. Photo: AP

     

    Privacy goes out the window!

     

    But it may be added that if you want personalized recommendations from the service provider, there is no option but to agree to what Google says.

    Google is right.

    One cannot have one's cake and eat it too.

     

    Google Inc updated its terms of service on Monday, informing users that their incoming and outgoing emails are automatically analyzed by software to create targeted ads.

    The revisions more explicitly spell out the manner in which Google software scans users’ emails, both when messages are stored on Google’s servers and when they are in transit, a controversial practice that has been at the heart of litigation.

    Last month, a U.S. judge decided not to combine several lawsuits that accused Google of violating the privacy rights of hundreds of millions of email users into a single class action.

    Users of Google’s Gmail email service have accused the company of violating federal and state privacy and wiretapping laws by scanning their messages so it could compile secret profiles and target advertising. Google has argued that users implicitly consented to its activity, recognizing it as part of the email delivery process.

    Google’s updated terms of service added a paragraph stating that “our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored.”

    A Google representative did not immediately respond to a request for comment.

     

    Source: Huffington Post.

     

  • Deep Web,The Dark Side Of Internet Details


    All of us know the Internet.


    The Deep Web search browser.
    What is Tor? How does it preserve anonymity?
    Tor is an acronym for “The Onion Router”, a system implemented to enable online anonymity. The Tor client software routes Internet traffic through a worldwide volunteer network of servers, hiding the user’s information and eluding monitoring.
    As often happens, the project was born in the military sector, sponsored by the US Naval Research Laboratory, and from 2004 to 2005 it was supported by the Electronic Frontier Foundation.
    Today the software is under the development and maintenance of the Tor Project. A user who navigates using Tor is difficult to trace, which ensures his privacy, because the data are encrypted multiple times as they pass through the nodes, called Tor relays, of the network.
    Connecting to the Tor network
    Imagine a typical scenario where Alice desires to connect to Bob using the Tor network. Let’s see step by step how it is possible.
    She makes an unencrypted connection to a centralized directory server containing the addresses of Tor nodes. After receiving the address list from the directory server, the Tor client software connects to a random node (the entry node) through an encrypted connection. The entry node makes an encrypted connection to a random second node, which in turn does the same to connect to a random third Tor node. The process goes on until it involves a node (the exit node) connected to the destination.
    Read more: http://thehackernews.com/2012/05/what-is-deep-web-first-trip-into-abyss.html#ixzz2ipKPzWY8
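    The layered encryption described above can be sketched in a few lines. This is a toy illustration only: real Tor uses per-hop TLS and symmetric keys negotiated with each relay, not base64, but the shape is the same; the client wraps the message once per relay, and each relay peels exactly one layer, so no single relay sees both the sender and the plaintext destination.

    ```python
    import base64

    # Toy onion routing: one reversible "encryption" layer per relay.
    relays = ["entry", "middle", "exit"]

    def wrap(message: str, hops: int) -> str:
        """Client side: add one layer per relay the message will traverse."""
        for _ in range(hops):
            message = base64.b64encode(message.encode()).decode()
        return message

    def peel(message: str) -> str:
        """Relay side: remove exactly one layer before forwarding."""
        return base64.b64decode(message.encode()).decode()

    onion = wrap("hello Bob", len(relays))
    for relay in relays:      # each relay in the circuit peels one layer
        onion = peel(onion)
    print(onion)  # hello Bob
    ```

    Only after the exit node removes the last layer does the plaintext emerge, which is why the exit node, and only the exit node, sees the destination traffic.
    
    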

    We are able to access these sites, and their information is indexed by search engines.

    We can access the information by relevant queries.

    This is a part of World Wide Web, www.

     

    There is another side to the world wide web where you cannot normally access the information, though it is still a part of the world wide web.

    This is called the Deepnet, the Invisible Web, the Undernet or the hidden Web.

    Then there is the dark Internet: computers that can no longer be reached via the Internet, or Darknet distributed file-sharing networks, which could be classified as a smaller part of the Deep Web.

    Mike Bergman, founder of BrightPlanet, coined the phrase.

    He explained that searching on the Internet today can be compared to dragging a net across the surface of the ocean: a great deal may be caught in the net, but the wealth of information that lies deep is missed.

    Most of the Web’s information is buried far down on dynamically generated sites, and standard search engines do not find it.

    Traditional search engines cannot “see” or retrieve content in the deep Web—those pages do not exist until they are created dynamically as the result of a specific search. As of 2001, the deep Web was several orders of magnitude larger than the surface Web.

    What is the Deep Web generally used for?

    1. Drug sales.

    2. Hiring contract killers.

    3. Seeking contract killers.

    4. Sexual perversions.

    5. Drug trafficking and money transfers.

    6. Child trafficking.

    7. Human trafficking.

    8. Mercenary recruitment and advertisement.

    9. Also used by some intelligence agencies for dark operations.

    Story:

    Hiring a hitman has never been easier. Nor has purchasing cocaine or heroin, nor even viewing horrific child pornography.

    Such purchases are now so easy, in fact, that they can all be done from the comfort of one’s home at the click of a button… and there’s almost nothing the police can do about it.

    This worrying development of the criminal black market is down entirely to the Deep Web – a seething matrix of encrypted websites – also known as Tor – that allows users to surf beneath the everyday internet with complete anonymity.

    And like The Silk Road, transactions are all made using the mysterious online currency Bitcoin. One site, whose name MailOnline has chosen not to publish, offers an assassination in the US or Canada for $10,000 and one in Europe for $12,000.

    ‘I do not know anything about you, you do not know anything about me,’ crows one self-styled assassin, according to The Daily Dot. ‘The desired victim will pass away. No one will ever know why or who did this. On top of that I always give my best to make it look like an accident or suicide.’

    Deepweb is buried in the Internet where prohibited activities take place.
    Ad in the Deep Web-Contract Killers.
    Mercenaries advertise in the Deep Web.
    DeepWeb advertisement by Mercenaries.

    THE DEEP WEB: WHAT IS TOR?

    Tor – short for The onion Router – is a seething matrix of encrypted websites that allows users to surf beneath the everyday internet with complete anonymity.

    It uses numerous layers of security and encryption to render users anonymous online.

    Normally, file sharing and internet browsing activity can be tracked by law enforcement through each user’s unique IP address that can be traced back to an individual computer.

    The Tor network on the Deep Web hides the IP address and the activity of the user.

    Most of the Web’s information is buried far down on dynamically generated sites, unable to be found or seen by traditional search engines – sites or pages don’t exist until created as the result of a specific search.

    An Internet search is like dragging a net across the surface of the sea – a great deal of information is caught, but a majority is deep and therefore missed.

    ‘I have gained endless experience(s) in this [sic] 7 years,’ he goes on. ‘It has changed me a lot. I don’t have any empathy for humans anymore.

    ‘This makes me the perfect professional for taking care of your problems and makes me better than other hitmen. If you pay enough I’ll do ANYTHING to the desired victim. If I say anything I mean anything.’

    Many of the sites even use slogans and marketing techniques that, if it weren’t for their macabre subject matter, could be at home on a legitimate retail website.

    ‘The best place to put your problems is in a grave,’ boasts one.

    Some even seem to offer others the chance to profit from their killing by allowing users to bet on when a victim will die by putting money in a pool. The closest guess takes home the pot.

    And while many appear every inch the cold-blooded killer one would expect from a gun-for-hire, there is also apparently the odd humanitarian hitman.

    ‘Killing is in most cases wrong, yes,’ writes one. ‘However, as this is an inevitable direction in the technological evolution, I would rather see it in the hands of me than somebody else.’

    ‘By providing it cheaply and accurately I hope that more immoral alternatives won’t be profitable or trusted enough. This should primarily be a tool for retribution.’

    Adding that murder should always be committed for ‘good reason’, he writes: ‘Bad reasons include doctors for performing abortions and Justin Bieber for making annoying music.

    How To Surf The Deep Web.

    You’ll need a browser named Tor. Open that up and get a new identity every few minutes. The rest is up to you. I’d recommend checking out the Evil Wiki, learning about onion sites, and seeing if you can find some links. That’s what I did the other night at least, and I found a bunch of weird shit. Like this one dude was selling “sex dolls”: he cuts the legs and arms off children and abuses them so they don’t feel pain, pulls their teeth out so they can’t bite, and so much more. So if that’s what interests you, go ahead, but honestly there isn’t really any reason to go to the “deep web”.

    Has the Deep Web closed after the busting of the Silk Road?

    In an interesting post-mortem release by the creators of the defunct anonymous marketplace Atlantis, there is information that the former admins and users of the Silk Road are planning to resurrect the service. User RR writes: “We have SilkRoad v2.0 ready to launch and is now in its final testing stages. Our site has all the features of the original one and we have kept the same style of forum for your ease.”

    The new SilkRoad will be sending out anonymous invites to former vendors and then open to the Tor-using public soon after.

    The representatives of Atlantis write:

    From a quick scout around I’ve counted at least 5 publicly stated projects with the stated aim of becoming “Silk Road 2.0″ and many, many more gathering info and building alliances.
    And this is what law enforcement is now parading as a victory? Over two years of investigation, millions of dollars spent, and for what? So a couple of armchair programmers can build it again in a few days, while in the meantime vendors simply move to other sites.

    Users are already planning ways to keep the new site secure. This includes the creation of something called BitWasp, an “open source, anonymous bitcoin marketplace specifically built for use in conjunction with Tor or I2P via the hidden services such as .onion websites and eepsites.”

    Sources:

    http://techcrunch.com/2013/10/04/deep-web-users-are-ready-to-launch-silk-road-2-0/

    http://www.reddit.com/r/deepweb/comments/1o7uaf/new_to_the_deepweb/

    http://www.dailymail.co.uk/news/article-2454735/The-disturbing-world-Deep-Web-contract-killers-drug-dealers-ply-trade-internet.html

    http://en.wikipedia.org/wiki/Deep_Web

    Image credit.http://unpromisedone.blogspot.in/2011/09/information-about-deep-web.html


  • Four Years In WordPress, 1.3 Million Hits

    This blog has completed four years in WordPress.

    Though I had an idea that I would be completing four years, I did not think much of it, as I was not really bothered about writing about it.

    However I received a notification from WordPress that this milestone(?) may be blogged as the blog has some interesting features.

    Following are the salient points.

    Fourth Anniversary in WordPress.

    Number of followers: 1,447

    Likes: 1,337

    Categories: 117

    Tags: 16,683 (did I tag that much?)

    Posts: 5,748

    Hits: 1,343,820

    Comments: 3,450

    Maximum hits recorded in a day: 29,803

    Average hits/day: 1,832

    This is the statistical part of it.

    What is more important to me is that I came across very good posts in my search for material; some of them are not found in the first twenty results of a Google search, but are very good in terms of content.

    This reaffirms my belief that good articles need not be restricted to the first page results alone.

    Probably these people are like me: they want to share information, yet do not know (as I do not) the nuances of getting a place in the first-page search results.

    Many of  my posts do find a place in the first results though.

    Good content is always appreciated and, more importantly, useful.

    This I have learnt from the feedback for my posts.

    People do write to me to elaborate on a particular subject I have blogged about, and in many a case they suggest topics.

    This is the part I enjoy the most.

    I do get criticism; fortunately most of it is constructive, guiding me towards clarity and research.

    Now most of my posts are on the basis of what the readers want and about what I am confident of writing.

    I do research.

    More than the readers I am happy I can acquire information.

    I had an initial perception that writing about a popular topic would drive traffic to your site.

    Now I find that it is content that drives traffic.

    Yet writing on popular topics gets you the initial readers and on their recommendations you get established.

    Then one can engage the readers by writing what they want and what one is confident about.

    I have received queries from some bloggers who have a personal blog and feel they are not getting the hits they should.

    My answer is that if your content is useful and interesting, hits will come, though it might take some time. None can say how long.

    Some of my posts dating back to 2009 get hits now.

    Thank God I did not delete them; on the contrary, I update them frequently.

    Updating a Post is important.

    Another reason why some personal blogs do not get hits is that people are not interested in your woes and happiness unless they are dramatic.

    Expression of personal emotions alone will not guarantee you Hits or readers.

    It is the general message that others can learn from such instances, backed with adequate proof or sources, that will interest readers.

    As usual I have not planned for this post and I have written what has come to my mind off the cuff.

    I shall update as and when I get ideas.

    And Thank You,

    Readers,

    WordPress.

  • Make Your Blog Easy To Read For Search Engines

    Googlebot (image credit: http://commons.wikimedia.org/wiki/File:GoogleBot-byFML.gif)

    Sometimes one has to hunt for good articles on the Internet, in Google Search or Bing.

    The content may be excellent and authentic, yet it does not appear without a very specific search.

    While content, post headings, categories and tags are important, one might as well understand how search engines like Bing and Google read and reach your site.

    As the volume of Internet material is huge, humans are not used for the task.

    Instead, search engines use bots*.

    These are the tools that locate your site.

    What do they do?

    1. The bot will first look at your file names.

    The file names must be bot-friendly.

    This applies to people who write in raw HTML.

    Check with qualified people for more on this subject.

    If you are writing on WordPress or Blogspot, they will take care of it.
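    One concrete, bot-facing file a site exposes alongside its file names is robots.txt, which tells crawlers which paths to skip (it is mentioned again in the note on bots at the end of this post). Here is a minimal sketch using Python's standard library; the domain and paths are made up for illustration:

    ```python
    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt: all bots may crawl everything except /private/.
    robots_txt = """
    User-agent: *
    Disallow: /private/
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # A well-behaved bot asks before fetching each URL.
    print(parser.can_fetch("*", "https://example.com/post/hello-world"))  # True
    print(parser.can_fetch("*", "https://example.com/private/draft"))     # False
    ```

    WordPress and Blogspot generate such a file for you, which is part of what "they will take care of it" means.
    
    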

    2. Once this is over, the bot moves on to meta descriptions.

    This is nothing but a note, preferably in a few words, about what the site contains.

    It is recommended that the meta description not exceed 60 characters.

    However, search engines have recently stopped relying on this step because of spamming.

    Still, it is better to have a neat, short description of what you plan to say on your site.
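    As a quick sanity check, the 60-character budget mentioned above can be measured programmatically. The tag below is a made-up example, not from any real site:

    ```python
    import re

    # A hypothetical meta description tag (invented content for illustration).
    meta = '<meta name="description" content="Recipes, history and health notes from a home kitchen.">'

    # Pull out the content attribute and measure it against the 60-character budget.
    content = re.search(r'content="([^"]*)"', meta).group(1)
    verdict = "OK" if len(content) <= 60 else "too long"
    print(len(content), "characters:", verdict)
    ```
    
    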

    3. The bots then move to the actual content, that is, everything you find between the <body> tags of your webpage. Be aware that if you use frames or tables in your content area, bots might not crawl through them; bots also have a lower capacity to crawl JavaScript and Flash than HTML, so it is better to design a webpage with HTML rather than Flash or JavaScript.

    4. The bot then checks for duplicate content.

    If your content duplicates other content on the web, there is a chance that your ranking for that particular page will be low, or that the page will be relegated to the supplementary index.

    Crawling

    Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

    We use a huge set of computers to fetch (or “crawl”) billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

    Google’s crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.

    Google doesn’t accept payment to crawl a site more frequently, and we keep the search side of our business separate from our revenue-generating AdWords service.
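    The link-discovery step described above ("it detects links on each page and adds them to its list of pages to crawl") can be sketched in a few lines using Python's standard library. This is a toy illustration of the idea, not Googlebot's actual code, and the page content is invented:

    ```python
    from html.parser import HTMLParser

    # Collect the href targets a crawler would append to its to-crawl list.
    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href":
                        self.links.append(value)

    # One "fetched" page; in a real crawler this would come over the network.
    page = '<p>See <a href="/about">about</a> and <a href="/archive">archive</a>.</p>'
    collector = LinkCollector()
    collector.feed(page)
    print(collector.links)  # ['/about', '/archive']
    ```

    A crawler repeats this for every fetched page, feeding the discovered links back into its queue.
    
    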

    Indexing

    Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. In addition, we process information included in key content tags and attributes, such as Title tags and ALT attributes. Googlebot can process many, but not all, content types. For example, we cannot process the content of some rich media files or dynamic pages.
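    The "index of all the words it sees and their location on each page" can be illustrated with a toy inverted index; the page names and text below are invented for the sketch:

    ```python
    # Toy inverted index: word -> list of (page, word position) pairs.
    pages = {
        "post-1": "medicine is an evolving science",
        "post-2": "search engines index the web",
    }

    index = {}
    for page, text in pages.items():
        for position, word in enumerate(text.split()):
            index.setdefault(word, []).append((page, position))

    # Serving a query is then a lookup, not a scan of every page.
    print(index["index"])  # [('post-2', 2)]
    ```

    Real indexes add the tag and attribute signals mentioned above (Title tags, ALT attributes), but the lookup structure is the same idea.
    
    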

    Serving results

    Source:

    https://support.google.com/webmasters/answer/70897?hl=en#1

    *

    Internet bots, also known as web robots, WWW robots or simply bots, are software applications that run automated tasks over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone. The largest use of bots is in web spidering, in which an automated script fetches, analyses and files information from web servers at many times the speed of a human. Each server can have a file called robots.txt, containing rules for the spidering of that server that the bot is supposed to obey.

    In addition to their uses outlined above, bots may also be implemented where a response speed faster than that of humans is required (e.g., gaming bots and auction-site robots) or less commonly in situations where the emulation of human activity is required, for example chat bots.

    Bots are also being used as organization and content-access applications for media delivery. Webot.com is one recent example of utilizing bots to deliver personal media across the web from multiple sources. In this case the bots track content updates on host computers and deliver live streaming access to a browser-based, logged-in user. (Wikipedia)