15,000+ fake views on ChallengePost Project

Hey there!

I hope you enjoyed my previous posts in the Technical Hacks category.

Starting with the project of the week – GitHub Profile Scraper. A few days ago, after finishing my latest project, GitHub Profile Scraper, I decided to post it on ChallengePost. GitHub Profile Scraper basically fetches the details of someone's GitHub profile and presents the information in an elegant way!

As soon as I posted the project on ChallengePost, it got sudden attention from the ChallengePost community; even a co-founder of ChallengePost liked the project and started following me. I was delighted by the immediate attention this side project was getting.

As the saying goes – "The more you get, the more you desire!" I was no different in this sense. I started looking for more and more visitors to my project and more followers on ChallengePost.

Then an evil idea came to my mind! I thought of automating the project page views, as I had noticed that repeated page views from the same IP address were each counted as separate page views. This reminded me of how easily that could be done with Python. So I opened my weapon of choice, Python IDLE! 😀

I wrote a simple Python script within the next 10 minutes and ran several versions of it (with different time delays). The result was an increase of almost two thousand page views. My project was soon trending at the top of ChallengePost. Soon after that, the co-founder Nealrs (probably) came to know about this and unliked the project. But I kept running the script, increasing the frequency of page visits. At one point I was sending almost 20 page requests every second. The only thing I was waiting for was a server crash, which never happened, and the page views on the project soon crossed 17,000. Yes, an increase of almost 15,000 page views.
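
To give a rough idea (the actual script is intentionally not shared, as noted in the disclaimer below), a minimal sketch of such a request loop, assuming the requests library and a placeholder project URL, could look something like this:

import requests
from time import sleep

# Placeholder URL; the real project URL is intentionally not included here.
PROJECT_URL = 'http://challengepost.com/software/github-profile-scraper'

hits = 0
while hits < 100:  # cap the number of automated views
    requests.get(PROJECT_URL)  # every request counts as a separate page view
    hits += 1
    print "Sent %d page views" % hits
    sleep(0.5)  # the delay was varied between versions of the script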

The best part was yet to come! Soon, the other co-founders of ChallengePost came to know about this, and instead of taking any action against me or my project (I was actually expecting them to remove my project), one of the co-founders, Brandon Kessler (@bkessler), tweeted me something like this –

This absolutely made my day! I then replied with a polite, humble tweet to assure them that I would not continue doing it.

FYI, the project was trending on ChallengePost for almost 3 days, which is a really big deal. Today, it has more than 20,000 page views. 😉

You can view the project on ChallengePost here: GitHub Profile Scraper | ChallengePost

For feedback/suggestions/feature-requests, tweet me @sahildua2305 #FakeViews

DISCLAIMER: Since I am making this trick public, the ChallengePost staff may have a problem with it, so I am removing the link to the script. It's simple enough to write yourself. 😉


Top 40 useful websites to learn new skills

Source: http://www.marcandangel.com/2010/05/24/top-40-useful-sites-to-learn-new-skills/

“Here are the top 40 sites I have personally used over the last few years when I want to learn something new.

 

Hack a Day – Hack a Day serves up fresh hacks (short tutorials) every day from around the web and one in-depth ‘How-To hack’ guide each week.

 

eHow – eHow is an online community dedicated to providing visitors the ability to research, share, and discuss solutions and tips for completing day-to-day tasks and projects.

 

Wired How-To Wiki – Collaborate with Wired editors and help them build their extensive library of projects, hacks, tricks and tips. Browse through hundreds of how-to articles and then add to them, or start a new one.

 

MAKE Magazine – Brings the do-it-yourself (DIY) mindset to all of the technology in your life.  MAKE is loaded with cool DIY projects that help you make the most of the technology you already own.

 

50 Things Everyone Should Know How To Do – While not totally comprehensive, here is a list of 50 things everyone should know how to do.  It’s a great starting point to learn new skills.

 

wikiHow – A user based collaboration to build and share the world’s largest, highest quality how-to manual.

 

Lifehacker – An award-winning daily blog that features tips, shortcuts, and downloads that help you get things done smarter and more efficiently.

 

100+ Google Tricks That Will Save You Time – Today, knowing how to use Google effectively is a vital skill.  This list links out to enough Google related resources to make you an elite Google hacker.

 

Instructables – Similar to MAKE, Instructables is a web-based documentation platform where passionate people share what they do and how they do it, and learn from and collaborate with others as they tackle new projects and learn new skills.

 

Merriam-Webster Online – In this digital age, your ability to communicate in written English is a paramount skill. And M-W.com is the perfect resource to improve your English now.

 

Lumosity – Learn to improve your memory by playing a series of fun and educational brain training games.

 

100 Skills Every Man Should Know – Another compilation article with instructions to help you learn new skills.  This one says it’s geared for men, but I think most of these skills are applicable to women as well.

 

5min Life Videopedia – Lots of great tutorials and DIY videos………”

Read More…

List of 101 most useful websites

Originally posted at: Monster Hub

List of 101 most useful websites:

01. screenr.com – record movies of your desktop and send them
straight to YouTube.

02. bounceapp.com – for capturing full length screenshots of web
pages.

03. goo.gl – shorten long URLs and convert URLs into QR codes.

04. untiny.me – find the original URL that's hiding behind a short URL.

05. localti.me – know more than just the local time of a city.

06. copypastecharacter.com – copy-paste special characters that
aren’t on your keyboard.

07. topsy.com – a better search engine for twitter.

08. fb.me/AppStore – search iOS apps without launching iTunes.

09. iconfinder.com – the best place to find icons of all sizes.

10. office.com – download templates, clipart and images for your
Office documents.

11. woorank.com – everything you wanted to know about a website.

12. virustotal.com – scan any suspicious file or email attachment for
viruses.

13. wolframalpha.com – gets answers directly without searching
– see more wolfram tips.

14. printwhatyoulike.com – print web pages without the clutter.

15. joliprint.com – reformats news articles and blog content as a
newspaper.

16. isnsfw.com – when you wish to share a NSFW page but with a
warning.

17. e.ggtimer.com – a simple online timer for your daily needs.

18. coralcdn.org – if a site is down due to heavy traffic, try
accessing it through coral CDN.

19. random.org – pick random numbers, flip coins, and more.

20. mywot.com – check the trust level of any website – example.

21. viewer.zoho.com – Preview PDFs and Presentations directly in the
browser.

22. tubemogul.com – simultaneously upload videos to YouTube and other
video sites.

23. truveo.com – the best place for searching web videos.

24. scr.im – share your email address online without worrying about spam.

25. spypig.com – now get read receipts for your email.

26. sizeasy.com – visualize and compare the size of any product.

27. whatfontis.com – quickly determine the font name from an image.

28. fontsquirrel.com – a good collection of fonts – free for personal
and commercial use.

29. regex.info – find data hidden in your photographs – see more EXIF
tools.

30. tineye.com – this is like an online version of Google Goggles.

31. iwantmyname.com – helps you search domains across all TLDs.

32. tabbloid.com – your favorite blogs delivered as PDFs.

33. join.me – share your screen with anyone over the web.

34. onlineocr.net – recognize text from scanned PDFs and images – see
other OCR tools.

35. flightstats.com – Track flight status at airports worldwide.

36. wetransfer.com – for sharing really big files online.

37. pastebin.com – a temporary online clipboard for your text and
code snippets.

38. polishmywriting.com – check your writing for spelling or
grammatical errors.

39. awesomehighlighter.com – easily highlight the important parts of
a web page.

40. typewith.me – work on the same document with multiple people.

41. whichdateworks.com – planning an event? find a date that works
for all.

42. everytimezone.com – a less confusing view of the world time
zones.

43. warrick.cs.odu.edu – you’ll need this when your bookmarked web
pages are deleted.

44. gtmetrix.com – the perfect tool for measuring your site
performance online.

45. imo.im – chat with your buddies on Skype, Facebook, Google Talk,
etc. from one place.

46. translate.google.com – translate web pages, PDFs and Office
documents.

47. youtube.com/leanback – sit back and enjoy YouTube videos in
full-screen mode.

48. similarsites.com – discover new sites that are similar to what
you like already.

49. wordle.net – quickly summarize long pieces of text with tag clouds.

50. bubbl.us – create mind-maps, brainstorm ideas in the browser.

51. kuler.adobe.com – get color ideas, also extract colors from
photographs.

52. followupthen.com – setup quick reminders via email itself.

53. lmgtfy.com – when your friends are too lazy to use Google on
their own.

54. tempalias.com – generate temporary email aliases, better than
disposable email.

55. pdfescape.com – lets you quickly edit PDFs in the browser itself.

56. faxzero.com – send an online fax for free – see more fax
services.

57. feedmyinbox.com – get RSS feeds as an email newsletter.

58. isendr.com – transfer files without uploading to a server.

59. tinychat.com – setup a private chat room in micro-seconds.

60. privnote.com – create text notes that will self-destruct after
being read.

61. flightaware.com – live flight tracking service for airports
worldwide.

62. boxoh.com – track the status of any shipment on Google Maps –
alternative.

63. chipin.com – when you need to raise funds online for an event or
a cause.

64. downforeveryoneorjustme.com – is your favourite site really
offline?

65. example.com – this website can be used as an example in
documentation.

66. whoishostingthis.com – find the web host of any website.

67. google.com/history – found something on Google but can’t remember
it now?

68. errorlevelanalysis.com – find whether a photo is real or a
photoshopped one.

69. google.com/dictionary – get word meanings, pronunciations and
usage examples.

70. urbandictionary.com – find definitions of slangs and informal
words.

71. seatguru.com – consult this site before choosing a seat for your
next flight.

72. sxc.hu – download stock images absolutely free.

73. imo.im – chat with your buddies on Skype, Facebook, Google Talk,
etc. from one place.

74. wobzip.org – unzip your compressed files online.

75. vocaroo.com – record your voice with a click.

76. scribblemaps.com – create custom Google Maps easily.

77. buzzfeed.com – never miss another Internet meme or viral video.

78. alertful.com – quickly setup email reminders for important
events.

79. encrypted.google.com – prevent your ISP and boss from reading
your search queries.

80. formspring.me – you can ask or answer personal questions here.

81. snopes.com – find if that email offer you received is real or
just another scam.

82. typingweb.com – master touch-typing with these practice sessions.

83. mailvu.com – send video emails to anyone using your web cam.

84. ge.tt – quickly send a file to someone, they can even preview it
before downloading.

85. timerime.com – create timelines with audio, video and images.

86. stupeflix.com – make a movie out of your images, audio and video
clips.

87. aviary.com/myna – an online audio editor that lets you record and
remix audio clip.

88. noteflight.com – print music sheets, write your own music online
(review).

89. disposablewebpage.com – create a temporary web page that self-destructs.

90. namemytune.com – when you need to find the name of a song.

91. homestyler.com – design from scratch or re-model your home in 3d.

92. snapask.com – use email on your phone to find sports scores, read
Wikipedia, etc.

93. teuxdeux.com – a beautiful to-do app that resembles a paper diary.

94. livestream.com – broadcast events live over the web, including
your desktop screen.

95. bing.com/images – automatically find perfectly-sized wallpapers
for mobiles.

96. historio.us – preserve complete web pages with all the
formatting.

97. dabbleboard.com – your virtual whiteboard.

98. whisperbot.com – send an email without using your own account.

99. sumopaint.com – an excellent layer-based online image editor.

100. lovelycharts.com – create flowcharts, network diagrams,
sitemaps, etc.

101. nutshellmail.com – Get your Facebook and Twitter streams in your inbox.

My timepass when SPOJ server was down!

Hey there!

I hope you enjoyed my last post about how I managed to download the profile pictures of my Facebook friends.

So, it was just like any other normal day, with me deep into solving SPOJ problems one after another. And suddenly this happened!

There is a problem with SPOJ and their judge is down. Hence we are not able to process your submissions. We are looking into this.

CodeChef (@codechef) September 6, 2014

 


All of the websites based on SPOJ (CodeChef, Ideone, etc.) were consequently down. I had just submitted my solution to a problem on SPOJ and was yet to see the result of my submission. But it stayed down long enough to make me bored.

I was continuously refreshing the SPOJ website to check its status. Like all other programmers, I am very lazy and believe in automating most of the boring tasks in life, so I thought of automating this task as well!

Here is how I did it:

import requests
import winsound
from time import sleep

count = 0
while True:
    response = requests.get('http://spoj.com/')
    count += 1
    # While SPOJ is down, the error page is exactly 231 characters long.
    if len(response.content) != 231:
        winsound.Beep(200, 5000)  # beep at 200 Hz for 5000 ms once the site is back
        print "Number of times checked: " + str(count)
        break
    print "Number of times checked: " + str(count)
    sleep(10)

 

This is the script I wrote in just 15 minutes to automate the task of checking whether the SPOJ website was back up yet.

How does this work?

The script keeps making HTTP requests to the SPOJ website and checks each response. Since the SPOJ server itself wasn't down, the HTTP requests weren't returning any error code, so it was hard to tell from the status alone whether the site was up. What I did instead was note down the length of the returned content while the website was down: it was just 231 characters (mainly the HTML for their error message). Using this number, I put a check on the length of the response content, so that whenever the response differs from 231 characters (i.e. the website is UP), the script makes my system beep for the given duration (in milliseconds).

Here is the GitHub repo where I have committed the entire code I used for this purpose. You can fork it and play around with it as you want.

For feedback/suggestions/feature-requests, tweet me @sahildua2305 #SPOJtimepass

This is how I downloaded profile pics of my Facebook friends using Graph API

Hey there !

A few days ago, my friend Hasil wrote a Python script to automatically post "Thank you" comments on birthday posts using the Facebook Graph API. It immediately sparked my interest in the Graph API. Having developed (small/mid-level) applications based on more than 8 APIs so far, this was just another API I was going to work with, but also the most complicated one I have ever worked with. So let me share my experience with you all; I hope you enjoy it as much as I enjoyed learning and using it.

What is Graph API?

Facebook Graph API

The Graph API is an HTTP-based API (Application Programming Interface) that allows developers to send GET and POST requests to Facebook's servers to exchange data.

HTTP is the standard web protocol used all over the world by most web applications for fetching data from and sending data to a remote server; GET and POST are two of its defined methods for working with data.

Since the Graph API is HTTP based, it lets us fetch data (GET requests) from and post data (POST requests) to Facebook's servers really easily, without going into the complex process underneath, and any programming language that supports these two request types can be used. But Python is my language of choice, for some obvious reasons 😉

To identify these GET/POST requests uniquely for each developer, Facebook uses an access token; all requests are identified by Facebook's servers using this token. We will shortly see how to get this access token. So let's get started.
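
As a quick illustration (a sketch added here, not from the original post), a single Graph API GET request made with the Requests library looks roughly like this; it assumes you already have an access token, which the next section covers:

import requests

# Hypothetical example: fetch your own name and id from the /me endpoint.
token = 'PASTE_YOUR_ACCESS_TOKEN_HERE'
response = requests.get('https://graph.facebook.com/v2.1/me',
                        params={'access_token': token})
print response.json()  # e.g. {u'name': u'...', u'id': u'...'}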

Getting an Access Token:

As mentioned earlier, every time we send a request (GET/POST) to Facebook, we need to include the access token so that Facebook's servers can identify us as a unique developer. Generating an access token is really easy. Just follow these simple steps:

  • Open the Graph API Explorer and click on Get Access Token, then tick all the options that fall under your set of requirements from the Graph API. (PS: you can tick all the options, including the ones under Extended Permissions.)
  • A really long random string will be generated; that's our access token, which is required to work with this amazing Graph API.

Fetching Details of Friends with a Script:

For making HTTP requests to Facebook's servers, I am going to use the Python module Requests. It makes our life really easy (one of the reasons Python is my language of choice).

import requests
import json

token = ''  # paste the access token that you got in the last step here
api_url = "https://graph.facebook.com/v2.1/"
params = {'access_token': token}
# Ask for each friend's full-size picture, gender and name.
call = "me/friends?fields=picture.width(9999).height(9999).type(large),gender,name"
response = requests.get(api_url + call, params=params)
r = json.loads(response.content)

And the output will be a JSON-encoded object containing the information about your friends. Now you just need to read the JSON object and use the information about each of your friends one by one.
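
For example (an illustrative sketch, not from the original post), you can loop over the parsed response; the 'data' key holds one entry per friend with the fields requested above:

# Print the name, gender and picture URL of each friend in the response.
for friend in r['data']:
    print friend['name'] + " (" + friend.get('gender', 'unknown') + ")"
    print "  picture: " + friend['picture']['data']['url']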

Downloading Profile Pictures from the information extracted:

Now that we have extracted all the information about our friends, the next job is to download everyone's profile picture. Just create a folder named 'images' in the directory where your Python script is saved.

import urllib2

# Download each friend's profile picture into the 'images' folder.
for f in r['data']:
    p_url = str(f['picture']['data']['url'])
    opener = urllib2.build_opener()
    page = opener.open(p_url)
    my_picture = page.read()
    filename = f['name'] + "_" + f['id'] + ".jpg"
    print filename + " downloaded..."
    fout = open('images/' + filename, "wb")
    fout.write(my_picture)
    fout.close()

Now you can check your images folder. It will have all the downloaded profile pictures of your friends.

That's it. It may seem complicated at first, but you can open the Graph API Explorer at any point and see how things work.

Here is the GitHub repo where I have committed the entire code I used to download the profile pictures of my Facebook friends. You can fork it and play around with it as you want.

For feedback/suggestions/feature-requests, tweet me @sahildua2305 #FBtrick