Browse the web, fetch API data, and more on your Flipper Zero.

Requirements

  • WiFi Developer Board, Raspberry Pi, or ESP32 device flashed with FlipperHTTP version 1.6 or higher: https://github.com/jblanked/FlipperHTTP
  • 2.4 GHz WiFi Access Point

Installation

  • Download from the Official Flipper App Store: https://lab.flipper.net/apps/web_crawler
  • Video tutorial: https://www.youtube.com/watch?v=lkui2Smckq4

Features

  • Configurable Request: Specify the URL of the website you want to send an HTTP request to or download from (tested up to 427 MB)
  • Wi-Fi Configuration: Enter your Wi-Fi SSID and password to enable network communication.
  • File Management: Automatically saves and manages received data on the device's storage, allowing users to view, rename, and delete the received data at any time.
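
File management relies on the Flipper's built-in Storage service. As an illustrative sketch only (not the app's actual code), appending a block of received data to the default output file described under Usage might look roughly like this; save_received_chunk is a hypothetical helper name:

    #include <furi.h>
    #include <storage/storage.h>

    /* Illustrative sketch, not the app's actual implementation: append a
       block of received data to the default output file on the SD card. */
    static bool save_received_chunk(const uint8_t* data, size_t size) {
        Storage* storage = furi_record_open(RECORD_STORAGE);
        File* file = storage_file_alloc(storage);
        bool ok = false;

        if(storage_file_open(
               file,
               EXT_PATH("apps_data/web_crawler_app/received_data.txt"),
               FSAM_WRITE,
               FSOM_OPEN_APPEND)) {
            ok = storage_file_write(file, data, size) == size;
            storage_file_close(file);
        }

        storage_file_free(file);
        furi_record_close(RECORD_STORAGE);
        return ok;
    }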

Usage

  1. Connection: After installing the app, turn off your Flipper, connect the WiFi Dev Board, then turn your Flipper back on.

  2. Launch the Web Crawler App: Navigate to the Apps menu on your Flipper Zero, select GPIO, then scroll down and select Web Crawler.

  3. Main Menu: Upon launching, you'll see a submenu containing the following options:

     • Run: Initiate the HTTP request.
     • About: View information about the Web Crawler app.
     • Settings: Set up parameters or perform file operations.

  4. Settings: Select Settings and navigate to WiFi Settings. Use the Flipper Zero's navigation buttons to input and confirm your settings. Do the same for the Request settings. Once configured, these settings will be saved and used for subsequent HTTP request operations.

For testing purposes:

  • https://httpbin.org/get - Returns GET data.
  • https://httpbin.org/post - Returns POST data.
  • https://httpbin.org/put - Returns PUT data.
  • https://httpbin.org/delete - Returns DELETE data.
  • https://httpbin.org/bytes/1024 - Returns bytes (DOWNLOAD method).
  • https://proof.ovh.net/files/1Mb.dat - Returns bytes (DOWNLOAD method; the whole file can be downloaded).
  • https://proof.ovh.net/files/10Mb.dat - Returns bytes (DOWNLOAD method; the whole file can be downloaded).
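
For reference, https://httpbin.org/get simply echoes the request back as JSON, so the saved file from a successful GET should contain something of roughly this shape (exact headers and values will differ):

    {
      "args": {},
      "headers": {
        "Accept": "*/*",
        "Host": "httpbin.org"
      },
      "origin": "<your public IP address>",
      "url": "https://httpbin.org/get"
    }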

  5. Running the Request: Select Run from the main submenu to start the HTTP request process. The app will:

     • Send Request: Transmit the HTTP request via serial to the WiFi Dev Board (a rough sketch of this step follows after this list).
     • Receive Data: Listen for incoming data.
     • Store Data: Save the received data to the device's storage for later retrieval.
     • Log: Display a detailed analysis of the operation status on the screen.

  6. Accessing Received Data: After the HTTP request operation completes, you can access the received data by either:

     • Navigating to File Settings and selecting Read File (preferred method), or
     • Connecting the Flipper to a computer and opening the SD/apps_data/web_crawler_app/ directory to access the received_data.txt file (or the file name/type customized in the settings).
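
The "Send Request" step talks to the board over the Flipper's UART. As a rough, hypothetical sketch of what that involves, using the firmware's furi_hal_serial API (the function name, command string, and 115200 baud rate are assumptions for illustration, not the app's exact protocol):

    #include <furi.h>
    #include <furi_hal_serial.h>
    #include <furi_hal_serial_control.h>
    #include <string.h>

    /* Illustrative sketch: push a command string to the WiFi Dev Board over
       UART. The actual command syntax is defined by the FlipperHTTP firmware. */
    static void send_command_to_board(const char* command) {
        FuriHalSerialHandle* serial =
            furi_hal_serial_control_acquire(FuriHalSerialIdUsart);
        furi_check(serial != NULL);

        furi_hal_serial_init(serial, 115200); /* assumed baud rate */
        furi_hal_serial_tx(serial, (const uint8_t*)command, strlen(command));
        furi_hal_serial_tx_wait_complete(serial);

        furi_hal_serial_deinit(serial);
        furi_hal_serial_control_release(serial);
    }

A call such as send_command_to_board("[GET]https://www.example.com/") is shown only as a placeholder; consult the FlipperHTTP repository for the real command format.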

Setting Up Parameters

  1. Path (URL): Enter the complete URL of the website you intend to crawl (e.g., https://www.example.com/).

  2. HTTP Method: Choose from GET, POST, DELETE, PUT, DOWNLOAD, and BROWSE.

  3. Headers: Add the headers to be used in your HTTP requests (see the examples after this list).

  4. Payload: Type in the JSON content to be sent with your POST or PUT requests.

  5. SSID (WiFi Network): Provide the name of your WiFi network to enable the Flipper Zero to communicate over the network.

  6. Password (WiFi Network): Input the corresponding password for your WiFi network.

  7. Set File Type: Enter your desired file extension. After saving, the app will rename your file, applying the new extension.

  8. Rename File: Provide your desired file name. After saving, the app will rename your file with the new name.
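
As a rough illustration of the Headers and Payload fields, values along these lines are typical; the exact format the app expects may differ, so treat them as examples rather than required syntax:

    Headers:  {"Content-Type": "application/json"}
    Payload:  {"name": "Flipper", "message": "hello from the Web Crawler app"}

With https://httpbin.org/post as the URL and POST as the method, a payload like this is simply echoed back in the response, which makes it a convenient first test.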

Happy Crawling! 🕷️

web_crawler - Latest version: 1.0.1 - Author: JBlanked (GitHub)