added timestamp recording
lan-party committed Sep 28, 2024
1 parent 8b408a4 commit 63efcc4
Showing 3 changed files with 11 additions and 3 deletions.
9 changes: 8 additions & 1 deletion README.md
@@ -23,4 +23,11 @@ Or edit it first to change some settings:
## Notes
Using [Shodan](https://www.shodan.io), you can find IP addresses to seed the web crawler and potentially reveal similar devices. Gather a list of addresses using Shodan's search filters, convert them to netblock abbreviations, then add those to the unscanned_netblocks.txt file, one per line. Netblocks are abbreviated in the following way: `111.111.111.`, which is equivalent to the CIDR notation `111.111.111.0/24`.

Some other public databases include [ZoomEye](https://www.zoomeye.hk/) and [Censys](https://search.censys.io/).
Some other public databases include [ZoomEye](https://www.zoomeye.hk/) and [Censys](https://search.censys.io/).

## To Do Next
- https check
- path checking
- timestamp recording
- crawler gui
- search engine web ui
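
A minimal sketch of the netblock conversion described in the Notes section above, assuming /24 CIDR blocks (as in the README's `111.111.111.` example) and the one-prefix-per-line format of unscanned_netblocks.txt; the `cidr_to_abbreviation` helper is hypothetical, not part of the repository:

```python
# Sketch: convert /24 CIDR blocks (e.g. exported from Shodan) into the
# abbreviated netblock form used by unscanned_netblocks.txt.
# Assumes /24 networks only, matching the README example.

def cidr_to_abbreviation(cidr):
    """Turn '111.111.111.0/24' into '111.111.111.'."""
    network, prefix_len = cidr.split("/")
    if prefix_len != "24":
        raise ValueError("only /24 netblocks are abbreviated this way: " + cidr)
    first, second, third, _ = network.split(".")
    return first + "." + second + "." + third + "."

if __name__ == "__main__":
    blocks = ["198.51.100.0/24", "203.0.113.0/24"]  # placeholder addresses
    with open("unscanned_netblocks.txt", "a") as f:
        for cidr in blocks:
            f.write(cidr_to_abbreviation(cidr) + "\n")
```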
2 changes: 1 addition & 1 deletion found_targets.tsv
@@ -1 +1 @@
address title page hash country region city ISP dork matches
address title page hash country region city ISP dork matches timestamp
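
With the new column, each row of found_targets.tsv ends in a timestamp field. A rough sketch of reading the file back with the standard csv module, assuming the header line above is tab-delimited and the column names include `address` and `timestamp` exactly as written:

```python
import csv

# Sketch: read found_targets.tsv rows into dicts keyed by the header columns.
# Assumes tab-delimited rows matching the header shown in this commit.
with open("found_targets.tsv", newline="") as f:
    reader = csv.DictReader(f, delimiter="\t")
    for row in reader:
        print(row["address"], row["timestamp"])
```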
3 changes: 2 additions & 1 deletion spiderdork.py
@@ -5,6 +5,7 @@
import json
import time
import threading
import datetime

# Netblock queues
unscanned_netblocks_file = open("unscanned_netblocks.txt", "r")
@@ -106,7 +107,7 @@ def save_addresses(addresses):
city = resp['city']
isp = resp['isp']

append_content += "\nhttp://" + address[0] + "\t" + str(title.encode("utf-8"))[2:-1] + "\t" + page_hash + "\t" + country + "\t" + region + "\t" + str(city.encode("utf-8"))[2:-1] + "\t" + str(isp.encode("utf-8"))[2:-1] + "\t" + json.dumps(address[2])
append_content += "\nhttp://" + address[0] + "\t" + str(title.encode("utf-8"))[2:-1] + "\t" + page_hash + "\t" + country + "\t" + region + "\t" + str(city.encode("utf-8"))[2:-1] + "\t" + str(isp.encode("utf-8"))[2:-1] + "\t" + json.dumps(address[2]) + "\t" + datetime.datetime.now().strftime("%m/%d/%Y")
# Append to file
save_file = open("found_targets.tsv", "a")
save_file.write(append_content)
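The appended line above concatenates nine tab-separated fields, with the new timestamp written as an MM/DD/YYYY date. An equivalent, easier-to-read sketch of that row construction using `"\t".join`; the field values below are placeholders standing in for the crawler's real variables, not data from the repository:

```python
import datetime
import json

# Sketch: build the same tab-separated row that the crawler appends to
# found_targets.tsv, ending with the timestamp added in this commit.
# Placeholder values stand in for the real address, title, page_hash, etc.
address = ["203.0.113.7", None, ["example dork"]]
title, page_hash = "Example title", "d41d8cd98f00b204e9800998ecf8427e"
country, region, city, isp = "US", "CA", "Example City", "Example ISP"

fields = [
    "http://" + address[0],
    str(title.encode("utf-8"))[2:-1],
    page_hash,
    country,
    region,
    str(city.encode("utf-8"))[2:-1],
    str(isp.encode("utf-8"))[2:-1],
    json.dumps(address[2]),
    datetime.datetime.now().strftime("%m/%d/%Y"),  # timestamp column
]
row = "\n" + "\t".join(fields)
```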
