I'd like this service a lot more if it had a "last_refreshed" field. BGP/network-announcement hijacks aren't exactly common, but it'd be a useful bit of info to have in terms of determining how reliable the announcement is.
By "reliable" it sounds like you mean "legitimate"? As in, $asn is "authorized" to announce $prefix? "originAS" exists for that purpose.
Also, it's not clear what they (ipinfo.io) are using as their source for the ASN. Are they simply reporting the ASN as provided by ARIN, etc., or are they actually running BGP and reporting the origin ASN as they see it in announcements? My money would be on the former, in which case any prefix hijacking would not affect the data reported by ipinfo.io.
I don't think a "last_refreshed" date would be that helpful, though. Netblocks aren't being shuffled around very often. I just looked at a previous employer's assignments and it was almost a decade ago that it was last updated. It's still 100% accurate, however.
Regardless, if you want it, that data is available from the RIRs. Go crazy.
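For example, the RIRs expose this over RDAP. A rough sketch against ARIN's endpoint (the fields printed are the standard RDAP ones; any origin-AS details an RIR publishes appear as registry-specific extensions):

import requests

# Rough sketch: fetch the registration record for an address from ARIN's RDAP
# service. The other RIRs expose the same protocol at their own endpoints.
resp = requests.get("https://rdap.arin.net/registry/ip/8.8.8.8", timeout=10)
resp.raise_for_status()
record = resp.json()

# Standard RDAP IP-network fields; origin-AS data, where published, shows up
# in registry-specific extension fields.
print(record.get("handle"), record.get("name"))
print(record.get("startAddress"), "-", record.get("endAddress"))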
That works for one lookup. The service I'm building on top of pyasn uses zeromq and can do 100,000+ lookups/sec
There's often a disconnect between something like that which works for one address, and something I could actually use to do bulk lookups on 5,000,000 addresses to generate reports.
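For reference, the lookup side of pyasn is only a few lines; a rough sketch (the data file name and the address list are placeholders - pyasn builds the file from a BGP RIB dump with its bundled utilities):

import pyasn

# Rough sketch of the bulk-lookup core. "ipasn.dat" is a placeholder for an
# IPASN data file built from a RIB dump using pyasn's bundled tools.
asndb = pyasn.pyasn("ipasn.dat")

addresses = ["8.8.8.8", "1.1.1.1"]  # in practice: millions of addresses

for ip in addresses:
    asn, prefix = asndb.lookup(ip)  # e.g. (15169, "8.8.8.0/24")
    print(ip, asn, prefix)

The lookups are in-memory radix-tree matches, which is why rates in that range are plausible; presumably the zeromq layer is just a thin wrapper around this.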
But IP geolocation usually gives you city- or zipcode-level accuracy, which may or may not be good enough to get an accurate Uber estimate, depending on how big your city/zip area is.
I just quickly hacked together this query which pulls out all amazon URLs in post answers:
SELECT REGEXP_EXTRACT(body, r'[^a-z](http[a-z\:\-\_0-9\/\.]+amazon[a-z\:\-\_0-9\/\.]*)[^a-z]') AS link, COUNT(1)
FROM [bigquery-public-data:stackoverflow.posts_answers]
GROUP BY 1 ORDER BY 2 DESC LIMIT 20
It takes 5 seconds to run - over ALL stackoverflow answers!
This is great news. The reason we support IPv6 lookups but only over IPv4 connections on https://ipinfo.io is that we use AWS with VPC, which hasn't historically supported IPv6. Expect many sites/services that were previously limited by this to add IPv6 now.
I just came across https://ipinfo.io, great service. I actually use it when hyperlinking IP addresses for a quick way to view GeoIP information. Thanks!
Just wanted to echo node's comment above - I regularly use ipinfo.io and find it very useful. I'm sure the plethora of other 'IP lookup' sites have very similar data, but your design and layout makes all the difference. Thank you!
> Each of David Filo, Eddy Hartenstein, Richard Hill, Marissa Mayer, Jane Shaw and Maynard Webb has indicated that he or she intends to resign from the Board effective upon the Closing, and that his or her intention to resign is not due to any disagreement with the Company on any matter relating to the Company’s operations, policies or practices.
Which indicates that she'll resign from the board, but doesn't say anything about her resigning as CEO. Is that just left off here because it doesn't require SEC disclosure, or is it a possibility that she'll stay?
If she stays, it should be with Verizon Yahoo, not with Altaba (the shell company, formerly known as Yahoo, that holds the Alibaba shares), which is a kind of company Mayer has no experience managing (and also not a managerial challenge worth the kind of salary she can get elsewhere).
Funny, I signed up for your $10/month plan just yesterday. I found out about you from a StackOverflow answer you posted, and thought I'd tell you that you should answer other questions like "get location from IP address {{ language }}". In particular, there was one for Python that didn't list your service, and it took me quite a while to find it. I'm very pleased with it so far.
Ahhhh... Geo IP stuff. That's the reason that I can't get a lot of the local channels on streaming apps. :-P
I'm actually an hour north of Dallas, but pretty much all of the Geo IP products show me as being out in east Texas, usually Mount Pleasant or Longview. That's 150 miles from where I am.
As a result, I get streams coming from Shreveport, LA instead of Dallas, TX.
Congrats!
One thing the post didn't answer is where you got the data (IMO, the most difficult part for this project). Are you using MaxMind or something else behind the scenes?
Congrats. I'm a bit confused as to why you got a warning email from Linode. I'm a customer, and as far as I know you can use 100% of the CPU that's assigned to your virtual machine without a problem. Were they informing you that you need to use less CPU or were they just suggesting that you might want to upgrade?
Also, why didn't you just expand to a more capable Linode or add another Linode? I've found their transfer to cost a small fraction of what AWS charges. I would think your operating costs would be less with Linode.
You can use as much CPU as you like - it was just a configured alert. I thought they were enabled by default, but perhaps not (you can turn them on at https://manager.linode.com/linodes/settings/).
I did initially add additional capacity at Linode, but eventually outgrew that. It's been a long journey from the original Linode VPS to the current setup :)
Thanks for reporting the back button issue. I've also seen this occasionally, but haven't seriously looked into it. Is this Chrome, or another browser?
> if I understand this correctly, ipinfo is basically a lookup in a db? What is the advantage of this over a local lookup, say with maxmind or similar?
There are 2 parts to that.
1) What's the advantage of using a geolocation API over a local database?
It's simpler. There's no need to download a database, or to remember to update it. You can call it from anywhere - there's a quick sketch below these points.
2) Why use ipinfo.io over other geolocation APIs?
The main 2 reasons are speed and reliability.
i) Reliability - we have multiple servers in auto-scaling groups around the globe with automatic failover, and an excellent uptime record.
ii) Speed - our API is designed to be extremely fast. We have servers on both US coasts, in Germany, and in Singapore, with geoDNS routing your request to the closest one to reduce latency even further.
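To make (1) concrete, a lookup is just an HTTP request to the JSON endpoint. A minimal Python sketch (the fields printed are a subset of the response, and depending on your plan you may need to add an access token):

import requests

# Minimal sketch of a lookup against the JSON endpoint.
resp = requests.get("https://ipinfo.io/8.8.8.8/json", timeout=5)
resp.raise_for_status()
data = resp.json()

# Typical fields include ip, hostname, city, region, country, loc and org.
print(data["ip"], data.get("city"), data.get("org"))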
Yeah, it's Chrome Version 54.0.2840.98 (64-bit) on OS X.
I'm sorry, but setting up a cronjob to download MaxMind + include the jars in the project seems easier and faster than incorporating a third-party web service.
edit: about speed - since you most likely call such a database yourself, you're not going to be faster than a local lookup.
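For comparison, the local lookup itself is only a few lines once the database file is in place (Python shown here for brevity; the GeoLite2 filename is just whatever the cronjob downloads, and the Java reader in the jars works much the same way):

import geoip2.database

# Local lookup against a MaxMind GeoLite2 City database file.
reader = geoip2.database.Reader("GeoLite2-City.mmdb")

record = reader.city("8.8.8.8")
print(record.city.name, record.country.iso_code)
print(record.location.latitude, record.location.longitude)

reader.close()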
This comment reminds me of the response Dropbox received when being introduced to ycombinator.
"...you can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem."
Cool thanks - I'll look into what might be causing the back button behavior.
> I'm sorry, but setting up a cronjob to download MaxMind + include the jars in the project seems easier and faster than incorporating a third-party web service.
Sure, if you've got the required sysadmin and dev skills, and a server to host the file. Not everyone does.
We also return additional data beyond geolocation, such as the ASN and hostname, plus optional fields such as company name and domain, and carrier details. You could download multiple databases and do it locally, but it's even more effort.
> about speed - since you most likely call such a database yourself, you're not going to be faster than a local lookup
Oh sure - it's not quicker than a local lookup - it's quicker than _other_ IP geolocation APIs.