Get started with the Python eBay SDK and use your API keyset to make automated queries to the 'Merchandising' and 'Finding' APIs. The eBay documentation is detailed and so vast that there are many examples, but until you see them used in a current video you never know whether the code still works. So this is a short introduction to help anyone who is starting out with coding and using the eBay API.
To get the API keyset (aka the App ID) you can follow along with my other video, which shows the whole process, from registering for a developer account to actually getting your first request working in the API example.
How to register to use the eBay API and test it:
----------------------------------------------------------------------
https://youtu.be/i9A3zvuMWNc
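As a taster, here is a minimal sketch of a Finding API call using the ebaysdk package. The keywords and the "YOUR-APP-ID" placeholder are just illustrative; substitute your own keyset from the developer account.

```python
# Minimal sketch: search the eBay Finding API with the ebaysdk package.
# "YOUR-APP-ID" is a placeholder - use the App ID from your own keyset.

def build_request(keywords, per_page=5):
    # Pure helper: the payload dict for a findItemsByKeywords call
    return {
        "keywords": keywords,
        "paginationInput": {"entriesPerPage": per_page},
    }

def search_ebay(app_id, keywords):
    # The actual API call (needs network access and a valid App ID)
    from ebaysdk.finding import Connection  # pip install ebaysdk
    api = Connection(appid=app_id, config_file=None)
    return api.execute("findItemsByKeywords", build_request(keywords))

# Usage, once you have a keyset:
#   response = search_ebay("YOUR-APP-ID", "raspberry pi")
#   for item in response.reply.searchResult.item:
#       print(item.title)
```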
Proxies
=================================================
If you need a good, easy-to-use proxy, I was recommended this one, and having used ScraperAPI for a while I can vouch for them. If you were going to sign up anyway, then maybe you would be kind enough to use the link and the coupon code below?
You can also do a full working trial first (unlike with some other companies). The trial doesn't ask for any payment details either, so all good!
10% off ScraperAPI : https://www.scraperapi.com?fpr=ken49
◼️ Coupon Code: DRPI10
(You can also get started with 5000 free API calls. No credit card required.)
Unpacking my SBR16 linear rails and bearings for my DIY CNC project.
The motion on 3 of the 4 bearings is not great. I bought them from eBay, so I won't be getting any more from that supplier. I will also be upgrading to a NEMA 23 motor, which needs more current, which means the Adafruit HAT will not be suitable.
Pity, because one of the four bearings is perfectly smooth, but I have read elsewhere that NEMA 23 is the way to go for CNC (a NEMA 17 is fine for 3D printing).
If all else fails I might buy one of these to experiment with! https://amzn.to/2KZcKEF
Automate the boring stuff : https://amzn.to/2MEEmlK
#CNC #RaspberryPi
Check out the Minimalist online python IDE :
https://epyco.herokuapp.com/
https://epico.herokuapp.com/
Buy Dr Pi a Coffee...or Tea! : https://www.buymeacoffee.com/DrPi
...
https://www.youtube.com/watch?v=KdmfCVa0WJ4
If you are web scraping with Scrapy, you may want to scrape many categories rather than just crawl every link.
If you can find a sitemap in JSON format, you can flatten its structure of lists and dictionaries and build a new list to use for your URLs, or form query-string parameters for the URLs you want to scrape.
Sound like hard work? Not really: eight lines of code inside a function and off you go. Just print the type() regularly to check what you are iterating through...
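The flattening step can be sketched like this (the sitemap structure below is invented for illustration; a real sitemap will nest differently, but the recursive walk is the same idea):

```python
import json

def flatten_sitemap(node, urls=None):
    """Recursively walk nested dicts/lists and collect anything that looks like a URL."""
    if urls is None:
        urls = []
    if isinstance(node, dict):
        for value in node.values():
            flatten_sitemap(value, urls)
    elif isinstance(node, list):
        for item in node:
            flatten_sitemap(item, urls)
    elif isinstance(node, str) and node.startswith("http"):
        urls.append(node)
    return urls

# A made-up JSON sitemap with nested lists and dictionaries
sitemap = json.loads("""
{"categories": [
  {"name": "books", "url": "https://example.com/books",
   "children": [{"name": "fiction", "url": "https://example.com/books/fiction"}]},
  {"name": "music", "url": "https://example.com/music"}
]}
""")

start_urls = flatten_sitemap(sitemap)
print(start_urls)  # a flat list, ready for a Scrapy spider's start_urls
```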
Timings:
0:00 Intro - About sitemaps
4:05 - Start the code
19:00 - Using slice to get 'code' and 'name'
Any questions, add a comment, I'll be pleased to reply!
Dr Pi.
#webscraping #json #sitemap
...
https://www.youtube.com/watch?v=Fzscc5gOECw
This video covers how to protect a route in Flask, as part of a new series which will cover authentication of various types and implementations. I write a very small Flask app and test it with a separate file, using requests.
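A minimal sketch of what a protected route can look like, assuming a simple shared API key sent in a header. The key, header name, and route are made up for illustration; the video's actual approach may differ.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
API_KEY = "secret-token"  # hypothetical key - load from config/env in real use

@app.route("/protected")
def protected():
    # Reject any request that doesn't present the expected key
    if request.headers.get("X-API-Key") != API_KEY:
        return jsonify(error="unauthorized"), 401
    return jsonify(message="welcome")

# A separate test file can then exercise it with requests while the app runs:
#   r = requests.get("http://127.0.0.1:5000/protected",
#                    headers={"X-API-Key": "secret-token"})
```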
Become a patron : ? https://www.patreon.com/drpi
Buy me a coffee (or Tea) ☕ https://www.buymeacoffee.com/DrPi
If you want a fast VPS server with Python installed check out :
https://webdock.io/en?maff=wdaff--170
Thumbs up yeah? (cos Algos..)
Check out : https://findthatbit.com
#flask #tutorial #pythonprogramming
...
https://www.youtube.com/watch?v=4abHEvWwWPM
A look at the Cisco sandbox API labs, checking out some of the code that makes GET and POST requests via an instance of the class provided.
I discuss how to connect to the lab using AnyConnect, how to send a request to a device, and show some output from a vManage SD-WAN device.
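To give a flavour, here is a bare-bones version of the kind of calls involved, using requests directly. The hostname and credentials are placeholders (the DevNet sandbox publishes the real ones), and the lab code wraps this sort of thing in a class.

```python
import requests

def device_list_url(base):
    # vManage exposes the device inventory under /dataservice/device
    return f"{base}/dataservice/device"

def login(session, base, username, password):
    # vManage authenticates with a form POST to j_security_check;
    # verify=False because the sandbox uses a self-signed certificate
    return session.post(f"{base}/j_security_check",
                        data={"j_username": username, "j_password": password},
                        verify=False)

# Usage against a live sandbox, once connected via AnyConnect:
#   s = requests.Session()
#   login(s, "https://YOUR-VMANAGE-HOST", "YOUR-USER", "YOUR-PASSWORD")
#   print(s.get(device_list_url("https://YOUR-VMANAGE-HOST"), verify=False).json())
```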
Proxies
=================================================
If you need a good, easy-to-use proxy, I was recommended this one, and having used ScraperAPI for a while I can vouch for them. If you were going to sign up anyway, then maybe you would be kind enough to use the link and the coupon code below?
You can also do a full working trial first (unlike with some other companies). The trial doesn't ask for any payment details either, so all good!
10% off ScraperAPI : https://www.scraperapi.com?fpr=ken49
◼️ Coupon Code: DRPI10
(You can also get started with 1000 free API calls. No credit card required.)
Thumbs up yeah? (cos Algos..)
-------------------------------------------------------------------------
You can buy bitcoin here:
https://www.swanbitcoin.com/python360
+ get $10 in BTC when you join Swan !!
--------------------------------------------------------------------------
Become a patron:
https://www.patreon.com/drpi
Buy Dr Pi a coffee (or Tea)
☕ https://www.buymeacoffee.com/DrPi
#CiscoDevNet #NetworkAutomation #python
...
https://www.youtube.com/watch?v=JSG2Dcu-gJ4
Some sites are very strict, with restrictions on requests that make it very difficult, or not viable, to scrape large lists of results, especially if you are getting 403 or 405 errors. If you are web scraping at scale, then at some point you will need to use a proxy.
A paid proxy is one option that lets you make requests (using requests) and get what you need.
https://www.scraperapi.com/?fp_ref=dr-pi
https://pypi.org/project/scraperapi-sdk/
Promo Code:
SCRAPE1462937
pip install scraperapi-sdk
And then just 3 lines of code:
✏️from scraper_api import ScraperAPIClient
✏️client = ScraperAPIClient("4e3d10c2650ed52c85a41c7xxxxxxx") # use your own api key (register for trial)
✏️response = client.get(url)
You still need to be sensible: use the correct user agent, headers, etc.
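For instance, something along these lines with plain requests (the header values here are a made-up, browser-like set; tailor them to the target site):

```python
import requests

# Hypothetical example of presenting browser-like headers
headers = {
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0",
    "Accept-Language": "en-GB,en;q=0.9",
}

def fetch(url):
    # Send the request with the custom headers and a sensible timeout
    return requests.get(url, headers=headers, timeout=10)

# response = fetch("https://example.com")
# print(response.status_code)
```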
After trialling a workaround (a bash script to change my VPN server to the next one in a list of 92), I decided to trial ScraperAPI as well.
Here you can see how I modified the code, and used my trial quota to get some results from a property listings site.
I tried other such services from other companies but found this one to be the best. I am NOT being paid to say this by the way!
Whilst these services are not cheap, one paid job a month should cover the cost.
I hope this has been informative and I'd like to thank you for viewing!
Thumbs up yeah?
This will eventually run on my Raspberry Pi 4 and if you want to see the code, it's on GitHub. It's still work in progress, but works 100%.
https://github.com/RGGH/BeautifulSoup
Thanks to Code Monkey King for his tips on proxies, and his top videos.
See you around yeah?
Dr P.
#webscraping #python #proxy
...
https://www.youtube.com/watch?v=ILnkt1ASpGg
Scraping or crawling a large site can generate a large amount of data to store. To avoid duplicating data it can be better to use multiple tables in a database.
Here I show a very simple way to use conditional logic in the Python code of a Scrapy spider to send data to different tables based upon the values being scraped. This is intended as an overview for anyone who has already looked at web scraping to a sqlite3 database or similar.
The perils of using CSV (e.g. Excel) are also discussed!
The aim of this video is to inspire you to explore the possibilities of using a database with multiple tables, to web scrape in a more structured way, and to consider the eventual purpose of your data and how it might be used or interrogated.
The code below is work-in-progress, but it works. You will need to have already established a connection to your database, and created the tables. See my previous videos in the playlist : https://www.youtube.com/playlist?list=PLKMY3XNPiQ7sp76LfY81dgpZRdaNi4nnM
If this has been of interest, the next step will be to learn about SQL, and creating tables.
Thanks for watching.
Dr Pi.
'Illegitimi non carborundum'
def parse_item(self, response):
    # Words to look for in the book title
    wsearch = ['train', 'murder']
    # Pull the title slug out of the URL, e.g. .../catalogue/some-title_123/...
    url = response.url
    booktitle = url.split("/")[4].split("_")[0]
    lstitle = booktitle.split("-")
    # True if any word from the title appears in the search list
    check = any(item in wsearch for item in lstitle)
    if check:
        myquery = """INSERT INTO links (url, book)
                     VALUES (%s, %s)"""
    else:
        myquery = """INSERT INTO links2 (url, book)
                     VALUES (%s, %s)"""
    val = (url, booktitle)
    cursor.execute(myquery, val)
    mydb.commit()
#webscraping #sqlite #database
...
https://www.youtube.com/watch?v=NeK04g2xKxY
In this tutorial, we'll guide you through the process of creating your very own video summarizer using a free Huggingface BART transformer model and Streamlit.
Video summarization is an essential technique that extracts key information from videos, saving time and effort. We'll be using Python and a Huggingface BART encoder-decoder model with the pipeline class. By the end of this video, you'll have a working free video summarizer that you can use to generate concise summaries from any video you feed it. Join us and unlock the potential of video summarization with Huggingface!
The model I used was "sshleifer/distilbart-cnn-12-6", but you can choose whichever suits you best from huggingface.co
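The core of the summarizer can be sketched like this. The chunking helper and the sizes are my own illustration (long transcripts need splitting to fit the model's input limit); the pipeline call is the standard transformers usage.

```python
def chunk_text(text, max_chars=2000):
    # Naive splitter so each piece fits within the model's input limit
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarise(transcript):
    # Requires network/model download on first run
    from transformers import pipeline  # pip install transformers
    summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
    parts = [summarizer(chunk, max_length=60, min_length=10)[0]["summary_text"]
             for chunk in chunk_text(transcript)]
    return " ".join(parts)

# print(summarise(open("transcript.txt").read()))
```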
Support my side project here : https://ko-fi.com/flightviz
Become a patron : https://www.patreon.com/drpi
Buy me a coffee (or Tea) ☕ https://www.buymeacoffee.com/DrPi
If you want a fast VPS server with Python installed check out :
https://webdock.io/en?maff=wdaff--170
Pytest with FastAPI course on "TestDriven.io":
-------------------------------------------------------------------------
https://testdriven.io/courses/tdd-fastapi/?utm_source=python360
https://testdriven.io/courses/scalable-fastapi-aws/?utm_source=python360
https://testdriven.io/courses/fastapi-celery/?utm_source=python360
Thumbs up yeah? (cos Algos..)
#VideoTranscript #HuggingFace #pythonprogramming
...
https://www.youtube.com/watch?v=YD-wagrJjhU
Python on Pi - parsing the CSV file to create config to go on the Cisco ASA.
Microsoft's public IP range CSV file is parsed by my Python script; with a for loop, "network objects" are created in the correct format to load straight onto the ASA via the command line.
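The gist of the transformation looks like this, with a made-up two-row CSV standing in for Microsoft's file (the "ip" column name and the object-naming scheme are assumptions; the real file's layout may differ):

```python
import csv
import io
import ipaddress

# Stand-in for Microsoft's published IP-range CSV
sample_csv = io.StringIO("ip\n13.107.6.152/31\n13.107.18.10/31\n")

config_lines = []
for row in csv.DictReader(sample_csv):
    net = ipaddress.ip_network(row["ip"])
    name = f"o365-{net.network_address}"
    # ASA CLI: "object network NAME" then an indented "subnet ADDRESS MASK"
    config_lines.append(f"object network {name}")
    config_lines.append(f" subnet {net.network_address} {net.netmask}")

print("\n".join(config_lines))
```

The ipaddress module converts the CIDR prefix into the dotted netmask the ASA expects.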
#Cisco #ASA #Python #Office365 #Whitelist #Parse #CSV
Check out the Minimalist online python IDE :
https://epyco.herokuapp.com/
https://epico.herokuapp.com/
Buy Dr Pi a Coffee...or Tea! : https://www.buymeacoffee.com/DrPi
...
https://www.youtube.com/watch?v=nEF1cOUZ580
Converting my project into a single Docker image using "Supervisor" along with Apache and Uvicorn for one self-contained image. This is an "MVP" at this stage, but it works 100%, and authentication would be the next task. Rather than use Flask, what I have done here is create a Docker image with FastAPI and an HTML/JavaScript front end that will run in a single container.
Note: this is for MVP/demo purposes; in production you would use separate containers and a Docker network: https://docs.docker.com/network/bridge/
aka : How to expose 2 ports in docker : Apache (html & js), Uvicorn (FastAPI)
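The supervisord config boils down to something like this. The program names and the `main:app` module path are illustrative; the actual file is in the repo under misc/supervisord.

```ini
[supervisord]
nodaemon=true

[program:apache2]
; Keep Apache in the foreground so Supervisor can manage it
command=apachectl -D FOREGROUND
autorestart=true

[program:uvicorn]
; Serve the FastAPI app on its own port alongside Apache
command=uvicorn main:app --host 0.0.0.0 --port 8000
autorestart=true
```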
--------------------------------------------------------------------------------------------------------------------------
If you would like to, maybe become a patron on Patreon?
https://www.patreon.com/drpi
--------------------------------------------------------------------------------------------------------------------------
-- chapters --
00:00 intro
03:05 supervisor (how to run 2 services)
06:07 Travis "How to deploy FastAPI with Nginx and Supervisor"
10:20 The front end running inside the Docker image
15:10 view logs - Docker from vscode
17:48 project summary
------------------------------------------
FastAPI playlist : https://youtube.com/playlist?list=PLKMY3XNPiQ7uJpJ_PmfbnURiXpqL4umPM
FastAPI + Docker ~ Series playlist : https://youtube.com/playlist?list=PLKMY3XNPiQ7sBx6l6RyfGO3PvIO3sDv7y
Download the docker image and try it out for yourself!
$ docker pull redandgreen/subnet-api:1
Full Project available here : https://github.com/RGGH/ip_checker
Useful links:
-----------------------------------------------------------------------------------------------
https://levelup.gitconnected.com/creating-an-api-with-fastapi-and-docker-809429d778e6
https://techexpert.tips/docker/docker-container-running-multiple-services/
https://docs.docker.com/config/containers/multi-service_container/
https://linuxhint.com/how-to-create-a-docker-image/
https://phoenixnap.com/kb/how-to-commit-changes-to-docker-image
----------------------------------------------------------------------------------------------
https://github.com/RGGH/ip_checker/blob/main/misc/supervisord for the supervisor config
10% off ScraperAPI : https://www.scraperapi.com?fpr=ken49
◼️ Coupon Code: DRPI10
(You can also get started with 5000 free API calls. No credit card required.)
Become a patron
https://www.patreon.com/drpi
Buy Dr Pi a coffee (or Tea)
☕ https://www.buymeacoffee.com/DrPi
Proxies
=================================================
If you need a good, easy-to-use proxy, I was recommended this one, and having used ScraperAPI for a while I can vouch for them. If you were going to sign up anyway, then maybe you would be kind enough to use the link and the coupon code below?
You can also do a full working trial first (unlike with some other companies). The trial doesn't ask for any payment details either, so all good!
10% off ScraperAPI : https://www.scraperapi.com?fpr=ken49
◼️ Coupon Code: DRPI10
Thumbs up yeah? (cos Algos..)
#docker #fastapi #python
...
https://www.youtube.com/watch?v=ftHIVbK7YSM