Updated scripts, YouTube cookies example, Solr docker compose configuration, and documentation.
parent 5108f82095
commit eba670b9f4

README.md

@@ -1,23 +1,88 @@
# Installing Git

On Ubuntu you can just do:

    apt install git

Microsoft Documentation: [https://learn.microsoft.com/en-us/devops/develop/git/install-and-set-up-git](https://learn.microsoft.com/en-us/devops/develop/git/install-and-set-up-git)

Git SCM Downloads: [https://git-scm.com/downloads](https://git-scm.com/downloads)

# Install Git LFS

This enables the large file storage system. It needs to be done once per user account that uses it.

    git lfs install

Documentation: [https://git-lfs.com/](https://git-lfs.com/)

# Cloning this Repository

You can use git to clone this repository.

    git clone https://gitea.daball.me/No-Moss-3-Carbo-Landfill-Online-Library/no-moss-3-carbo-landfill-library.online.git

-# Installing NPM Application Dependencies
-You can use npm to install the dependencies.
-    npm install

# Pulling Git Source Updates

From inside the working directory, you can use git to pull source updates from the origin repository.

    git pull

# Download Git LFS

From inside the working directory, download all the large file storage updates from the origin server.

    git lfs pull

# Install Docker and Docker Compose (optional)

Installing Docker is beyond the scope of this document. You may choose to install Docker to simplify running this web site.

[https://docs.docker.com/get-docker/](https://docs.docker.com/get-docker/)

# Install Solr with Tika and Tesseract

Installing Solr with Tika and Tesseract is beyond the scope of this document. It is the search engine I am currently using.

From inside the working directory, go into the solr folder and use Docker Compose to bring up Solr and Tika. Tesseract will be installed in the Tika instance.

    cd solr
    docker compose up

Take note of your Docker hostname. Docker should be exposing port 8983 for the Solr instance and port 9998 for the Tika instance.

Solr Test URL: [http://localhost:8983/](http://localhost:8983/)

Tika Test URL: [http://localhost:9998/](http://localhost:9998/)

If you have trouble accessing them from outside localhost, check that the exposed Solr port is allowed through your firewall so the web server host can reach it. Permit both the Solr and Tika ports so that any npm workers can request the plaintext for documents.
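
The actual compose file ships in the solr folder; as a rough sketch of how those two ports get wired up (the image tags and service names here are assumptions, not the repository's real configuration), it looks something like:

    services:
      solr:
        image: solr:9                   # official Solr image; tag is an assumption
        ports:
          - "8983:8983"
      tika:
        image: apache/tika:latest-full  # the "full" build bundles Tesseract OCR
        ports:
          - "9998:9998"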

# `yt-dlp` Pre-requisite: ffmpeg

Before you can re-encode the yt-dlp output, you will need to install ffmpeg and have the `ffmpeg` binary available in your `PATH`.

[https://ffmpeg.org/download.html](https://ffmpeg.org/download.html)

# Required: `yt-dlp`

You need to install yt-dlp and have the `yt-dlp` binary available in your `PATH`.

Project: [https://github.com/yt-dlp/yt-dlp](https://github.com/yt-dlp/yt-dlp)

Standalone Binaries: [https://github.com/yt-dlp/yt-dlp#release-files](https://github.com/yt-dlp/yt-dlp#release-files)

Installation: [https://github.com/yt-dlp/yt-dlp/wiki/Installation](https://github.com/yt-dlp/yt-dlp/wiki/Installation)

# *Windows*: Reload Environment

In Windows you will need to log out and back in after updating your environment variables. In PowerShell you can reload your `PATH` environment variable using:

    $env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine") + ";" + [System.Environment]::GetEnvironmentVariable("Path","User")

# Downloading YouTube Archives

You may need to use your web browser to log in to YouTube. You can then copy each cookie value out of your web browser using the Developer Tools Network tab. Gather the correct values to build a `youtube-cookies.txt` file in your working directory using something like the `youtube-example-cookies.txt` template here:
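
Once the cookie file is in place, a minimal invocation looks something like this (the channel URL is a placeholder; the full archival commands with all their flags are in the scripts further down):

    yt-dlp --cookies youtube-cookies.txt "https://www.youtube.com/@SomeChannel"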

@@ -52,14 +117,122 @@ You can update the copy of the Virginia code:

    mirror-virginia-law.ps1
-# Install Docker (optional)
-Installing Docker is beyond the scope of this document. You may choose to install Docker to simplify running this web site.
-# Install Solr with Tika and Tesseract
-Installing Solr with Tika and Tesseract is beyond the scope of this document. It is the search engine I use.
-**TODO**: Document installation with Docker.

# (Optional) Clearing the Search Index

You can clear the search index using npm:

    npm run-script index:clear

# (Optional) Rebuild the Search Index

You can clear and rebuild the search index in one go using npm:

    npm run-script index:reindex

# Incrementally Build the Document Search Index

You can scan all of the documents into the search index incrementally using npm. If the index is empty, all the documents will be scanned. If a file already exists in the index, its hash is checked before the document's text is scanned again; the presumption is that a second scan would produce the same text, and the scan is computationally expensive. Using npm, incrementally build the search index:

    npm run-script index:docs
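
A sketch of that incremental check, with hypothetical endpoint, core, and field names (the real worker lives in the app source; this only illustrates the hash-before-rescan idea):

    // incremental-index-sketch.ts: assumes Node 18+ for global fetch
    import { createHash } from "crypto";
    import { readFileSync } from "fs";

    const SOLR = "http://localhost:8983/solr/documents"; // core name is an assumption
    const TIKA = "http://localhost:9998/tika";

    async function indexDocument(path: string): Promise<void> {
      const bytes = readFileSync(path);
      const hash = createHash("sha256").update(bytes).digest("hex");

      // Look up the hash stored for this file on a previous run, if any.
      const q = `${SOLR}/select?q=${encodeURIComponent(`id:"${path}"`)}&fl=hash`;
      const existing = (await (await fetch(q)).json()).response.docs[0];
      if (existing && existing.hash === hash) return; // unchanged: skip the expensive scan

      // Ask Tika for the plaintext, then store it alongside the new hash.
      const text = await (await fetch(TIKA, {
        method: "PUT",
        body: bytes,
        headers: { Accept: "text/plain" },
      })).text();
      await fetch(`${SOLR}/update?commit=true`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify([{ id: path, hash, text }]),
      });
    }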

# Incrementally Build the Laws Search Index

You can scan all of the laws into the search index incrementally using npm, with the same hash check as for documents. Using npm, incrementally build the search index:

    npm run-script index:laws

# Installing Node.js

In order to run the web app server, you must have Node.js installed.

[https://nodejs.org/](https://nodejs.org/)

# Installing NPM

Your Node.js installation should come with npm. If you don't have it, see these directions:

[https://docs.npmjs.com/downloading-and-installing-node-js-and-npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)

# Installing Web App Server Dependencies

You can use npm to install the web app server dependencies.

    npm install

# Before Running the Node Server

You may need to transpile the TypeScript to JavaScript.

    npm run-script transpile:ts

# Running the Node Server

You can run the web server using npm.

    npm run-script server

Or you can use Node.js directly.

    node app/server.js

# Reverse Proxy using Caddy

If you want to use Caddy as a reverse proxy to the web application, try a Caddyfile like this:

    www.no-moss-3-carbo-landfill-library.online {
        redir https://no-moss-3-carbo-landfill-library.online{uri}
    }

    no-moss-3-carbo-landfill-library.online {
        reverse_proxy localhost:3000
    }
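
To try it, two standard Caddy subcommands are enough, run from the directory containing the Caddyfile:

    caddy validate
    caddy run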

# Reverse Proxy using Nginx

If you want to use Nginx as a reverse proxy to the web application, try a configuration file like this:

    server {
        listen 80;
        server_name www.no-moss-3-carbo-landfill-library.online;
        location / {
            rewrite ^/(.*) http://no-moss-3-carbo-landfill-library.online/$1 permanent;
        }
    }

    server {
        listen 80;
        server_name no-moss-3-carbo-landfill-library.online;
        location / {
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header Host $host;
            proxy_pass http://127.0.0.1:3000;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
        }
    }

# Securing Nginx

If you want a free SSL certificate for Nginx, try certbot to keep your certificates up-to-date.

[https://letsencrypt.org/docs/client-options/](https://letsencrypt.org/docs/client-options/)
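
For example, with certbot's nginx plugin installed, one command can obtain a certificate and update the server blocks above (the domains shown are this site's own):

    sudo certbot --nginx -d no-moss-3-carbo-landfill-library.online -d www.no-moss-3-carbo-landfill-library.online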

# Reverse Proxy Using IISNode

If you want to use IIS to access the Node instance, this can be greatly simplified by using IISNode.

IISNode requires the URL Rewrite module for IIS.

[https://iis-umbraco.azurewebsites.net/downloads/microsoft/url-rewrite](https://iis-umbraco.azurewebsites.net/downloads/microsoft/url-rewrite)

Then you can set up IISNode.

[https://github.com/Azure/iisnode](https://github.com/Azure/iisnode)

The included web.config is an example configuration for IISNode.
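
For orientation, a typical IISNode configuration pairs a handler mapping with a URL Rewrite rule, roughly like this (the entry-point path is an assumption; defer to the included web.config):

    <configuration>
      <system.webServer>
        <handlers>
          <!-- hand requests for the entry script to the iisnode module -->
          <add name="iisnode" path="app/server.js" verb="*" modules="iisnode" />
        </handlers>
        <rewrite>
          <rules>
            <!-- route all other URLs to the Node entry point -->
            <rule name="node">
              <match url="/*" />
              <action type="Rewrite" url="app/server.js" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>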

# Post-Installation Considerations

If running the web server application outside of Docker or IISNode, consider using a daemonizer such as PM2.

[https://pm2.io/docs/runtime/guide/installation/](https://pm2.io/docs/runtime/guide/installation/)
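
A minimal PM2 setup might look like this (the process name is arbitrary):

    pm2 start app/server.js --name library
    pm2 save

`pm2 save` records the process list so PM2 can resurrect it after a reboot; see `pm2 startup` for boot integration.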

**TODO**: Finish these instructions.

mirror-virginia-law.ps1

@@ -2,6 +2,7 @@ mkdir public
cd public
mkdir Virginia_Law_Library
cd Virginia_Law_Library
echo Downloading Virginia Law Library.
wget https://law.lis.virginia.gov/library/constitution/constitution.epub -OutFile constitution.epub
wget https://law.lis.virginia.gov/library/constitution/constitution.pdf -OutFile constitution.pdf
+wget https://law.lis.virginia.gov/CSV/Constitution.csv -OutFile Constitution.csv
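
Note that in Windows PowerShell, `wget` is a built-in alias for `Invoke-WebRequest`, which is why the `-OutFile` parameter works here; with GNU wget the equivalent flag would be `-O`.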

@@ -1,3 +0,0 @@
-S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/watch?v=WMEw18t9p1Q"
-S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/watch?v=lZfmk1RPdbk"
-S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/watch?v=RRGeIh_fh_M"

@@ -1,5 +1,12 @@
@echo off
S:\bin\yt-dlp.exe -U
+echo Downloading https://www.youtube.com/@russellcountyvirginia8228
S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/@russellcountyvirginia8228"
+echo Downloading https://www.youtube.com/@VADMME
S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/@VADMME"
-npm run-script index:docs
+echo Downloading https://www.youtube.com/watch?v=WMEw18t9p1Q
+S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/watch?v=WMEw18t9p1Q"
+echo Downloading https://www.youtube.com/watch?v=lZfmk1RPdbk
+S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/watch?v=lZfmk1RPdbk"
+echo Downloading https://www.youtube.com/watch?v=RRGeIh_fh_M
+S:\bin\yt-dlp.exe --live-from-start --yes-playlist -N 8 -R infinite -c --no-force-overwrites --mtime --write-description --write-info-json --write-playlist-metafiles --write-comments --no-cookies-from-browser --cookies S:\srv\www\no-moss-3-carbo-landfill-library.online\youtube-cookies.txt --write-thumbnail --write-all-thumbnails --write-url-link --write-webloc-link --write-desktop-link --progress --video-multistreams --audio-multistreams --write-subs --write-auto-subs --embed-subs --embed-thumbnail --embed-metadata --embed-chapters --embed-info-json -o "S:\srv\www\no-moss-3-carbo-landfill-library.online\YouTube\%%(uploader_id)s\%%(upload_date>%%Y-%%m-%%d)s-%%(title)s\%%(id)s.%%(ext)s" "https://www.youtube.com/watch?v=RRGeIh_fh_M"

youtube-example-cookies.txt

@@ -1,5 +1,5 @@
# Netscape HTTP Cookie File
-# This file is generated by yt-dlp. Do not edit. I think I might have once needed to create it first in Edge and copy my values from my logged in browser.
+# This file is generated by yt-dlp. Do not edit.

.youtube.com TRUE / TRUE ... GPS ...
.youtube.com TRUE / FALSE ... PREF tz=...&f6=...&f7=...&hl=...