AI Zone Admin Forum


AI + internet = ?
 
 

So, I have an AI that is fairly capable of processing English language and voice commands, and can even automate computer tasks. Now I’ve found this program, Curl, a simple exe that can send POST and GET requests over the internet and so retrieve webpage contents or submit online forms. By launching Curl.exe with e.g. a system() call, the AI that inhabits my computer can read websites, send emails, activate my PHP scripts, and so on, but I am lost for ideas on what to do with it.
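For what it’s worth, shelling out to Curl can be wrapped in a couple of small functions so the rest of the program never builds raw command strings. A minimal Python sketch (the `-s` and `-d` flags are standard curl options; the wrapper names are my own invention):

```python
import subprocess

def curl_args(url, post_data=None):
    """Build a curl command line: GET by default, POST when form data is given."""
    args = ["curl", "-s", url]  # -s silences curl's progress meter
    if post_data is not None:
        # -d switches curl to a POST request with a form-encoded body
        args += ["-d", "&".join(f"{k}={v}" for k, v in post_data.items())]
    return args

def fetch(url, post_data=None):
    """Run curl and return the response body as text."""
    return subprocess.run(curl_args(url, post_data),
                          capture_output=True, text=True).stdout
```

The same two calls then cover both reading webpages and submitting online forms.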

What do you have your chatbot / AI / virtual assistant / home automation do with access to the resources of the internet?

P.S. I don’t order pizzas online: Too expensive compared to supermarket prices.

 

 
  [ # 1 ]

IFTTT (If This Then That) might give you some ideas.
https://ifttt.com/recipes

Most people try to automate their regular tasks. Use your AI to keep a log, then automate the repetitive things.
I use the AI for searches and news. You can also hook it up to Twitter or other social media.

 

 
  [ # 2 ]

I use it to monitor stock prices, calculate individual stock value vectors, and then execute automated micro-trades based on the analysis.

Needless to say I will soon be a multi-gazillionaire!

 

 

 
  [ # 3 ]

If you can get back web pages and your chatbot is fairly good at English processing, then can it scan the content and extract knowledge to save for use in future conversations? In other words, can it learn simple facts from the content it gets back from the web? For example, maybe it could scan for any occurrence of the word “is”, “learn” x is-a y relations, and then use those to answer questions later. I know people are trying to procedurally and manually scan and process large volumes of data like Wikipedia, store the results in a database, and then have their chatbot or AI use those facts, but I think your approach is more interesting: the chatbot/AI itself can just browse the internet and “read” what it is interested in while you are chatting with it or while it is “thinking”. smile
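A crude version of that “is” scan can be done with a single regular expression. A hypothetical sketch (it will pick up plenty of junk, which is exactly the noise problem discussed further down):

```python
import re

# Harvest "x is a/an y" relations from scraped text.
ISA = re.compile(r"\b([A-Za-z]+) is (?:a|an) ([A-Za-z]+)", re.IGNORECASE)

def learn_isa_facts(text):
    """Return (x, y) pairs for every 'x is a/an y' phrase in the text."""
    return [(x.lower(), y.lower()) for x, y in ISA.findall(text)]

def what_is(x, facts):
    """Answer a later 'what is x?' question from the learned relations."""
    return next((y for a, y in facts if a == x), None)
```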

I am interested in how the information from web pages can be parsed and how the semantics can be represented and stored. For your AI’s knowledge, do you write and store your patterns/topics/facts/responses in your own custom format? Have you ever experimented with triple stores, RDF, OWL, or other standardized interchangeable formats? Or is your AI using a database with your own custom records?

It would be interesting to have your AI search news topics that interest you and then have a conversation with you. AI: “I was browsing the Internet and saw that New Horizons snapped an awesome pic of demoted planet Pluto on its one-day flyby after travelling 9 years. After it flushes its memory it will carry uploaded crowd-sourced selfies to the Aliens and Predators of the galaxy. Would you consider uploading your selfie, or not so much?”

Seriously, truly customizable news would be a great feature of using Curl to retrieve web content.

 

 
  [ # 4 ]
Alaric Schenck - Jul 16, 2015:

Seriously, truly customizable news would be a great feature of using Curl to retrieve web content.

Why reinvent the wheel? It’s called “Google News”, try it some time wink

Of course, if you are brave and have some time, you should try Curl WITH Google… hopefully that will get you thinking about how to leverage Curl (hint: for instance, try recursive Googling of the links from Google’s results).
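That recursive-Googling idea is basically a depth-limited crawl. A sketch with the fetching and link-extraction steps injected as functions, so Curl (or anything else) can be plugged in; all names here are hypothetical:

```python
def crawl(start_url, fetch, extract_links, depth=2, seen=None):
    """Recursively follow links, depth-limited, skipping pages already seen.
    fetch(url) returns a page; extract_links(page) returns its outgoing urls."""
    if seen is None:
        seen = set()
    if depth == 0 or start_url in seen:
        return seen
    seen.add(start_url)
    for link in extract_links(fetch(start_url)):
        crawl(link, fetch, extract_links, depth - 1, seen)
    return seen
```

The `seen` set doubles as the result: every url visited within the depth limit.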

 

 
  [ # 5 ]
Alaric Schenck - Jul 16, 2015:

the chatbot/AI itself can just browse the internet and “read” what it is interested in while you are chatting with it or while it is “thinking”.

This is precisely what Laybia AI does already (using Curl, recursive Googling, and lots of parsing and statistical analysis).

Alaric Schenck - Jul 16, 2015:

It would be interesting to have your AI search news topics that interest you and then have a conversation with you.

This part of Laybia AI is still a work in progress, but it does function somewhat like that for things like “do you like x” or “what is your opinion of y”, mostly using info from scraping to fill in the [who what where when how mood] slots related to the original input.

 

 

 
  [ # 6 ]

IFTTT is a good tip, Merlin, thanks. It seems most of what I do is program AI and check a few fora. Maybe I can have the AI analyse the chatbots forum’s daily newsletter (after some mail-to-php service) and let me know when interesting topics are posted, so I can reduce my online idle time.

Carl B - Jul 16, 2015:

I use it to monitor stock prices, calculate individual stock value vectors, and then execute automated micro-trades

Really? Because that’s a good idea. In terms of (legal) money-making, other options seem to be filling in paid surveys, translation jobs, or writing a blog. The latter is quite fascinating: have an AI write its own regular blog. I’d probably have to do a bit (severe understatement) more work in the language generation department, and find sites that offer good source material.

Alaric, you may find a few answers in my old topic. I built my own custom database of fact triples, stored as words and numbers in text files, divided into folders per subject. I’ve never worked with other ontologies; I don’t often agree with their categorisation, hierarchy, or reliability, but they’re not all bad either.
The AI parses text files the same way it parses user sentences, so yes, it can extract and learn facts by reading webpages that Curl downloads to file, but it’ll also need to filter out html tags, Wikipedia formatting, etc. As it is only “fairly good” though, often 30% of the learned facts will be misunderstandings that muck up the database. A solution would be a second database that only stores internet facts temporarily. It’s a good suggestion that the AI could “educate” itself on a topic of conversation, and given that this will take several seconds, the best time to do it would be while the user is typing.
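That temporary second database could be as simple as a quarantine counter: a scraped fact only graduates to the trusted store after it has been seen independently a few times. A hypothetical sketch:

```python
from collections import Counter

class ProbationaryFacts:
    """Facts scraped from the web sit in quarantine until seen often enough."""
    def __init__(self, promote_after=3):
        self.confirmed = set()       # the trusted store
        self.pending = Counter()     # fact -> times observed so far
        self.promote_after = promote_after

    def observe(self, fact):
        """Record one sighting; promote the fact once the threshold is met."""
        if fact in self.confirmed:
            return
        self.pending[fact] += 1
        if self.pending[fact] >= self.promote_after:
            self.confirmed.add(fact)
            del self.pending[fact]
```

A one-off misparse then stays in quarantine instead of mucking up the main database.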

“I was browsing the Internet and saw that New Horizons snapped an awesome pic of demoted planet Pluto on its one-day flyby after travelling 9 years.

This could be arrived at by retrieving the title of a news article on a topic of interest, but making the AI judge the visual quality of pictures might be difficult unless one throws a neural net at it.

After it flushes its memory it will carry uploaded crowd-sourced selfies to the Aliens and Predators of the galaxy.

This would take document-summarisation AI to determine which facts or sentences are most interesting. It would be possible-ish. I was thinking of having the AI summarise my emails like that.
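Even a naive extractive method goes a surprisingly long way for this: score each sentence by how frequent its words are in the whole document, and keep the top few. A sketch (all names hypothetical):

```python
import re
from collections import Counter

def summarise(text, n_sentences=2):
    """Naive extractive summary: keep the sentences whose words are
    most frequent in the document overall, in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(w.lower() for w in re.findall(r"[A-Za-z]+", text))

    def score(sentence):
        words = re.findall(r"[A-Za-z]+", sentence)
        return sum(freq[w.lower()] for w in words) / max(len(words), 1)

    best = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return [s for s in sentences if s in best]
```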

“Would you consider uploading your selfie, or not so much?”

I can’t even begin to consider what systems of planning, prediction and psychology it would take an AI to make an appropriate suggestion. Long-term project, maybe smile

 

 
  [ # 7 ]
Don Patrick - Jul 16, 2015:
Carl B - Jul 16, 2015:

I use it to monitor stock prices, calculate individual stock value vectors, and then execute automated micro-trades

Really?

Oh yeah

 

 
  [ # 8 ]
Don Patrick - Jul 16, 2015:

but it’ll also need to filter out html tags, wikipedia formatting, etc.

You can get a fairly structured version of any Wikipedia page. Example:

https://en.wikipedia.org/wiki/Special:Export/Artificial_intelligence

Info:

https://en.wikipedia.org/wiki/Help:Export

 

 
  [ # 9 ]

I was about to ask, thanks smile

 

 
  [ # 10 ]

I found an even better way to extract the plain text from Wikipedia, using their online API. Saves me a lot of unnecessary work:
https://en.wikipedia.org/w/api.php?format=json&action=query&prop=extracts&explaintext&redirects=true&exintro&titles=robots
I’m going to experiment further and see if I can’t set up a “summarise this page for me” command. The hard part is getting the program to know which website I’m looking at.
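The same query can be assembled with a URL encoder instead of by hand, and the text dug out of the JSON reply. The nesting under query → pages → pageid → extract is what the API returns for prop=extracts; the helper names are my own:

```python
import json
from urllib.parse import urlencode

def wiki_extract_url(title):
    """Build the intro-only, plain-text extract query for a page title."""
    params = {"format": "json", "action": "query", "prop": "extracts",
              "explaintext": 1, "exintro": 1, "redirects": "true",
              "titles": title}
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)

def extract_text(raw_json):
    """Pull the plain-text extract out of the API's JSON reply."""
    pages = json.loads(raw_json)["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")
```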

The coolest thing I’ve done so far is establish online communication between my phone and home AI through php. Now all I need is an iPhone, a smartwatch with voice recognition, and a black Sedan to relive the 80’s.

I’ve also looked into geolocation from IP, but it’s not accurate to a useful degree, and it may be just as easy to launch Google Maps with some parameters to figure out where to go. Same with weather info, which Windows 8 already offers at the press of a button. Maybe I should just get Siri or Cortana.

 

 
  [ # 11 ]

Okay, so I succeeded in making a ridiculously roundabout procedure to get my AI to summarise a website I’m looking at. It goes like this:

Windows Speech recognises “Download this page” while the browser is in the foreground;
Windows Speech Macros passes the command on to an exe;
the exe passes the command on to the AI;
simulate an F6 keypress (selects the browser address bar);
simulate Ctrl-C (copies the url);
Curl-download the url found in the clipboard;
read the downloaded file;
summarise it to a local html file;
launch the html file in the still-open browser.

Plenty difficult, and some sites like i-programmer still give me a 404 when Curl somehow omits a symbol from the requested url. I learned that filtering out html tags wasn’t as easy as just skipping the tags themselves: at javascript tags, header tags or menu navigation tags, you’ll also want to skip the text in between the tags, but not at others. And even then it’s not always clear whether words between div and span tags are part of the menu or the main text.
Then there are html character codes to convert and, worse, special symbols that can’t be read as ASCII characters, often in blogs composed with Microsoft Word.
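The “skip the text in between some tags but not others” rule fits neatly into an event-driven parser: count how deeply we are inside unwanted elements and only collect text at depth zero. A sketch using Python’s standard html.parser (the set of skipped tags is a guess, tune to taste):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Strip tags, and also drop the text inside script/style/head/nav blocks."""
    SKIP = {"script", "style", "head", "nav"}

    def __init__(self):
        super().__init__(convert_charrefs=True)  # decodes &amp; etc. for us
        self.depth = 0      # how many SKIP elements we are currently inside
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

def html_to_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

It won’t settle the ambiguous div/span cases, but it handles the clear-cut ones and the character codes in one pass.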

Anyway, should it interest others, here is a minimalist way I found to get Google search results:

https://ajax.googleapis.com/ajax/services/search/web?v=1.0&rsz=4&as_qdr=y&q=robots
rsz specifies the number of results to get, with a maximum of 8 per request.
as_qdr=y restricts the search to pages updated within a year; other values are d, w, m (day, week, month).

The returned result contains several urls and a value for “estimatedResultCount”, which you could use, for instance, to estimate how commonly two keywords coincide, probably good for a whole bunch of things. I’d love to hear if you’ve found uses for it.
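One concrete way to turn estimatedResultCount into a “commonness of coinciding” score is pointwise mutual information: query x, query y, query both together, then compare the joint count to what independence would predict. A sketch (the total-pages figure would itself have to be estimated):

```python
import math

def cooccurrence_score(count_xy, count_x, count_y, total_pages):
    """Pointwise mutual information from estimated result counts:
    log of how much more often x and y co-occur than chance predicts.
    Zero means independent; positive means they go together."""
    p_xy = count_xy / total_pages
    p_x = count_x / total_pages
    p_y = count_y / total_pages
    return math.log(p_xy / (p_x * p_y))
```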

 

 
  [ # 12 ]

There’s a universally available utility which you can use to normalise HTML into say XHTML or XML.

https://en.wikipedia.org/wiki/HTML_Tidy

Having done that, you can then use XSLT via another utility such as xsltproc to easily process it into the form you want. These two utilities make web scraping as straightforward as it’s ever likely to get, but it’s still a dirty business.

 

 
  [ # 13 ]

Thanks, Andrew! That may solve my remaining troubles.

 

 
  [ # 14 ]

Well, I finally programmed one internet function that I’ll actually be using: custom TV guide notifications grin. I found a back door into an online TV guide’s programming interface, have Curl download the entire listing, and compare each programme’s title to a list of my favourite keywords. It then schedules all programmes that match in my local AI’s ‘reminder’ system, which auto-wakes my computer from hibernation and tells me that Doctor Who is on right now. It really saves me a lot of clicking and ad annoyances. Whether it saves me time is arguable, however: I did spend 6 hours programming it.
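The keyword-matching step of that filter is a one-liner once the listing is parsed into (title, start time) pairs; a hypothetical sketch:

```python
def matching_programmes(listing, keywords):
    """Return (title, start_time) pairs whose title contains a favourite keyword,
    matched case-insensitively."""
    keywords = [k.lower() for k in keywords]
    return [(title, start) for title, start in listing
            if any(k in title.lower() for k in keywords)]
```

Each returned pair can then be handed straight to the reminder scheduler.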

With all the json / xml / html / etc. formats and all the programming languages required to deal with them, I’ve instead programmed a small C++ function that searches for alphanumerical sequences (i.e. words, values or tag names) in any file format, regarding symbols as their edges. E.g. if I want the weather from Yahoo’s XML API, I do gettag(“yweather:condition”), then gettag(“text”), and then capture the next-first word, which is e.g. “Cloudy”. If anyone wants, I can post the code for it. The alternative would be to do all of this processing online, which my AI isn’t.
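A rough Python rendition of that gettag idea, reconstructed from the description above (the original is C++): scan forward for each landmark token in turn, then grab the next alphanumeric run, regardless of whether the file is XML, JSON, or anything else.

```python
import re

WORD = re.compile(r"[A-Za-z0-9]+")

def word_after(text, *names):
    """Scan forward through the raw file for each name in turn, then return
    the first alphanumeric word after the last one, or None if any is missing."""
    pos = 0
    for name in names:
        pos = text.find(name, pos)
        if pos < 0:
            return None
        pos += len(name)
    match = WORD.search(text, pos)
    return match.group(0) if match else None
```

The appeal of the approach is that one function covers every format, at the cost of ignoring the document structure entirely.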

I also learned how to instantly launch Google Maps and/or Google Street View at my exact street location. There’s a similar service at Instant Street View. The first numbers in the url are the latitude and longitude, which you can retrieve, to a less accurate degree, from sites like http://ipinfo.io. So my AI can now tell its location, but I don’t have a use for it that Google’s apps haven’t already covered. I suppose I could have it do things like count the number of windows on a certain building at a certain location using computer vision on Street View, but I can’t really see that coming in handy smile.

 

 
  [ # 15 ]

I made another one of those “virtual assistant” functions that I’ll never actually use.

Image Attachments
emailarckon.jpg
 

 