2017.05.27 Sat
1. Set up the HTTP request and create a URL to get Twitter information.
2. The Python API is just a wrapper that fills in the required information.
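A rough sketch of what that raw request looks like, using the requests and requests_oauthlib libraries; the credentials, query, and count below are placeholders, and wrappers like tweepy essentially fill these pieces in for you:

import requests
from requests_oauthlib import OAuth1

# Placeholder app credentials -- the wrapper asks for the same four values.
auth = OAuth1('CONSUMER_KEY', 'CONSUMER_SECRET', 'ACCESS_TOKEN', 'ACCESS_SECRET')

# Build the URL by hand: the 1.1 search endpoint plus query parameters.
url = 'https://api.twitter.com/1.1/search/tweets.json'
params = {'q': 'python', 'count': 10}

resp = requests.get(url, auth=auth, params=params)
print(resp.status_code)
print(resp.json()['statuses'][0]['text'])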
3. My task would be to monitor the number of tweets coming from PLACE and containing this filter.
4. Then I will display them on a map, and this app would be called Current Trending.
5. For later analysis, I'd better save the whole tweet JSON object; the raw data will be really helpful if I want to apply NLP algorithms.
6. Add a save method to the tweepy API.
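A minimal sketch of that monitor-and-save plan using tweepy 3.x streaming; the credentials, bounding box, keyword, and output file name are placeholders. Note the streaming API ORs locations and track, so an extra client-side check may be needed to get "from PLACE AND containing the filter":

import tweepy

class SaveRawListener(tweepy.StreamListener):
    # Append every raw tweet JSON line to a file for later NLP work.
    def on_data(self, raw_data):
        with open('tweets_raw.json', 'a') as f:  # placeholder file name
            f.write(raw_data)
        return True

    def on_error(self, status_code):
        print('stream error:', status_code)
        return False  # stop the stream on errors such as 420 (rate limited)

# Placeholder credentials.
auth = tweepy.OAuthHandler('CONSUMER_KEY', 'CONSUMER_SECRET')
auth.set_access_token('ACCESS_TOKEN', 'ACCESS_SECRET')

stream = tweepy.Stream(auth, SaveRawListener())
# Placeholder bounding box (SW lon, SW lat, NE lon, NE lat) and keyword.
stream.filter(locations=[-74.3, 40.5, -73.7, 40.9], track=['trending'])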
7. A good trick is to try different APIs and compare them to find the most convenient way to achieve your goal.
8. I like PyCharm, which helps me debug and understand the whole script.
9. Python wraps the HTTP connection into a Response object, then uses a reader to get all the info.
Haha, I feel like I'm able to link all the parts together!
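For example, with just the standard library (a rough illustration of that layering, not the exact internals of any particular wrapper; any JSON endpoint works as the example URL here):

from urllib.request import urlopen  # Python 3 standard library
import json

# urlopen wraps the underlying HTTP connection in an HTTPResponse object...
with urlopen('https://api.github.com/') as resp:
    body = resp.read()  # ...and that object is the file-like reader you pull the bytes from
    data = json.loads(body.decode('utf-8'))
print(list(data)[:3])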
10. pprint JSON:
The json module already implements some basic pretty printing with the indent parameter:
>>> import json
>>>
>>> your_json = '["foo", {"bar":["baz", null, 1.0, 2]}]'
>>> parsed = json.loads(your_json)
>>> print(json.dumps(parsed, indent=4, sort_keys=True))
11. The next question is, how do I want to save the data? How do I want to start the analysis?
We can also call an analysis API to do the work. Wow, so amazing.
12. I decided to follow a post to finish the project quickly and then add my own ideas on top of it. It will save tons of time and still boost my resume.
2017.05.28 Sun
1. I need to create a test page under Django for the JS.
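A rough sketch of such a test page, assuming a hypothetical Django app where test_page.html is a template that just loads the JS:

# views.py -- serve a bare page whose only job is to load the test JS
from django.shortcuts import render

def test_page(request):
    return render(request, 'test_page.html')  # placeholder template name

# urls.py -- wire the view to a URL (Django 1.x style)
from django.conf.urls import url
from . import views

urlpatterns = [
    url(r'^test/$', views.test_page, name='test_page'),
]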
2. On top of the raw data, we need another layer to reduce the number of repeated tweets and the noise.
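A simple sketch of that layer, assuming we only key on each tweet's id_str to drop exact repeats (anything smarter, like near-duplicate text detection, comes later):

import json

seen_ids = set()

def is_new_tweet(raw_line):
    # Return True the first time a tweet id appears, False for repeats.
    tweet = json.loads(raw_line)
    tweet_id = tweet.get('id_str')
    if tweet_id is None or tweet_id in seen_ids:
        return False
    seen_ids.add(tweet_id)
    return True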
3. For the data retrieval part, we can apply a lot of algorithms.
4. Next step: learn how to display the real-time data.