Updated scripts, YouTube cookies example, docker compose configuration, and documentation.
Parent: 3ac71738e5
Commit: 555e9f52bd

Changes to **README.md**:

# Installing Git

On Ubuntu you can just do:

    apt install git

Microsoft Documentation: [https://learn.microsoft.com/en-us/devops/develop/git/install-and-set-up-git](https://learn.microsoft.com/en-us/devops/develop/git/install-and-set-up-git)

Git SCM Downloads: [https://git-scm.com/downloads](https://git-scm.com/downloads)

# Installing Git LFS

This enables the large file storage system. It needs to be done once per user account that uses it.

    git lfs install

Documentation: [https://git-lfs.com/](https://git-lfs.com/)

# Cloning this Repository

You can use git to clone this repository.

    git clone https://gitea.daball.me/No-Moss-3-Carbo-Landfill-Online-Library/no-moss-3-carbo-landfill-library.online.git

# Pulling Git Source Updates

From inside the working directory, you can use git to pull source updates from the origin repository.

    git pull

# Pulling Git LFS Updates

From inside the working directory, download all the large file storage updates from the origin server.

    git lfs pull

# Installing Docker and Docker Compose (optional)

Installing Docker is beyond the scope of this document. You may choose to install Docker to simplify running this web site.

[https://docs.docker.com/get-docker/](https://docs.docker.com/get-docker/)
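
After installing, you can confirm that both Docker and the Compose plugin are available (the reported versions will vary by system):

    docker --version
    docker compose version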

# Installing Solr with Tika and Tesseract

Installing Solr with Tika and Tesseract is beyond the scope of this document. It is the search engine I am currently using.

From inside the working directory, go into the solr folder and use Docker Compose to bring up the instances. Solr and Tika come up together, and Tesseract is installed in the Tika instance.

    cd solr
    docker compose up

Take note of your Docker hostname. Docker should be exposing port 8983 for the Solr instance and port 9998 for the Tika instance.

Solr Test URL: [http://localhost:8983/](http://localhost:8983/)

Tika Test URL: [http://localhost:9998/](http://localhost:9998/)
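
For a quick check from the command line, something like the following should get a response from both services (substitute your Docker hostname if the containers are not on localhost):

    curl http://localhost:8983/solr/
    curl http://localhost:9998/tika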

If you have trouble reaching them from outside localhost, make sure the exposed Solr and Tika ports are allowed through your firewall so that the web server host and any npm workers can request the plaintext of documents.
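
On Ubuntu with ufw, for example, you might open the two ports only to the web server host; the address below is a placeholder for that host:

    sudo ufw allow from 192.0.2.10 to any port 8983 proto tcp
    sudo ufw allow from 192.0.2.10 to any port 9998 proto tcp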

# Installing ffmpeg

Before you can re-encode the yt-dlp output, you will need to install ffmpeg and have the `ffmpeg` binary available in your `PATH`.

[https://ffmpeg.org/download.html](https://ffmpeg.org/download.html)
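
To verify the installation, and as an illustration of a typical re-encode (the file names are placeholders, and the scripts in this repository may use different settings):

    ffmpeg -version
    ffmpeg -i input.webm -c:v libx264 -c:a aac output.mp4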

# Installing yt-dlp

You need to install yt-dlp and have the `yt-dlp` binary available in your `PATH`.

Project: [https://github.com/yt-dlp/yt-dlp](https://github.com/yt-dlp/yt-dlp)

Standalone Binaries: [https://github.com/yt-dlp/yt-dlp#release-files](https://github.com/yt-dlp/yt-dlp#release-files)

Installation: [https://github.com/yt-dlp/yt-dlp/wiki/Installation](https://github.com/yt-dlp/yt-dlp/wiki/Installation)
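
For example, one of the documented installation routes is pip; afterwards, confirm the binary is on your `PATH`:

    python3 -m pip install -U yt-dlp
    yt-dlp --version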

# *Windows*: Reloading Environment

In Windows you will need to log out and back in after updating your environment variables. In PowerShell you can reload your `PATH` environment variable using:

    $env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine") + ";" + [System.Environment]::GetEnvironmentVariable("Path","User")

# Downloading YouTube Archives

You may need to use your web browser to log in to YouTube. You can then copy each cookie value out of your web browser using the Developer Tools Network tab. Gather the correct values to build a `youtube-cookies.txt` file in your working directory using something like the `youtube-example-cookies.txt` template here:

…

You can update the copy of the Virginia code:

    mirror-virginia-law.ps1

# Installing Node.js

In order to run the web app server, you must have Node.js installed.

[https://nodejs.org/](https://nodejs.org/)

# Installing NPM

Your Node.js should come with NPM. If you don't have it, see these directions:

[https://docs.npmjs.com/downloading-and-installing-node-js-and-npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)
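
To confirm both are installed and on your `PATH`:

    node --version
    npm --version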

# Installing Web App Server Dependencies

You can use npm to install the web app server dependencies.

    npm install

# Before Running the Node Server

You may need to transpile the TypeScript to JavaScript.

    npm run-script transpile:ts

# (Optional) Clearing the Search Index

You can clear the search index using npm:

    npm run-script index:clear

# (Optional) Rebuilding the Search Index

You can clear and rebuild the search index in one go using npm:

    npm run-script index:reindex

# Incrementally Building the Document Search Index

You can scan all of the documents into the search index incrementally using npm. If the index is empty, all the documents will be scanned. If a file is already in the index, its hash is checked before the document's text is scanned again; the presumption is that a second scan would produce the same text, and scanning is computationally expensive. Using npm, incrementally build the document search index:

    npm run-script index:docs

# Incrementally Building the Laws Search Index

You can scan all of the laws into the search index incrementally using npm. The same incremental hash check applies as for documents. Using npm, incrementally build the laws search index:

    npm run-script index:laws

# Running the Node Server

You can run the web server using npm.

    npm run-script server

Or you can use Node.js directly.

    node app/server.js

# Reverse Proxy using Caddy

If you want to use Caddy as a reverse proxy to the web application, try a Caddyfile like this:

    www.no-moss-3-carbo-landfill-library.online {
        redir https://no-moss-3-carbo-landfill-library.online{uri}
    }

    no-moss-3-carbo-landfill-library.online {
        reverse_proxy localhost:3000
    }
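
Assuming the Caddyfile sits in the current directory, you can run it in the foreground with the command below; Caddy will also obtain and renew TLS certificates for these host names automatically, provided DNS points at this server and ports 80 and 443 are reachable:

    caddy run --config Caddyfile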

# Reverse Proxy using Nginx

If you want to use Nginx as a reverse proxy to the web application, try a configuration file like this:

    server {
        listen 80;
        server_name www.no-moss-3-carbo-landfill-library.online;
        location / {
            rewrite ^/(.*) http://no-moss-3-carbo-landfill-library.online/$1 permanent;
        }
    }

    server {
        listen 80;
        server_name no-moss-3-carbo-landfill-library.online;
        location / {
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header Host $host;
            proxy_pass http://127.0.0.1:3000;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
        }
    }
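
After placing the file in your Nginx configuration directory (for example under `/etc/nginx/conf.d/`, though the exact path varies by distribution), validate and reload:

    sudo nginx -t
    sudo systemctl reload nginx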

# Securing Nginx

If you want a free SSL certificate for Nginx, try Certbot to keep your certificates up to date.

[https://letsencrypt.org/docs/client-options/](https://letsencrypt.org/docs/client-options/)
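
On Ubuntu, for example, a typical Certbot run with the Nginx plugin looks like this (the domains shown are this site's; adjust them for your own deployment):

    sudo apt install certbot python3-certbot-nginx
    sudo certbot --nginx -d no-moss-3-carbo-landfill-library.online -d www.no-moss-3-carbo-landfill-library.online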

# Reverse Proxy Using IISNode

If you want to use IIS to access the Node instance, this can be greatly simplified by using IISNode.

IISNode requires the URL Rewrite module for IIS.

[https://iis-umbraco.azurewebsites.net/downloads/microsoft/url-rewrite](https://iis-umbraco.azurewebsites.net/downloads/microsoft/url-rewrite)

Then you can set up IISNode.

[https://github.com/Azure/iisnode](https://github.com/Azure/iisnode)

The included web.config is an example configuration for IISNode.

# Post-Installation Considerations

If running the web server application outside of Docker or IISNode, consider using a daemonizer such as PM2.

[https://pm2.io/docs/runtime/guide/installation/](https://pm2.io/docs/runtime/guide/installation/)
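
A minimal PM2 setup might look like the following; the process name is only a label and is not defined anywhere in this repository:

    npm install -g pm2
    pm2 start app/server.js --name no-moss-library
    pm2 save
    pm2 startup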

New file **docker/README.md**:

# Docker Compose configurations

Here are some Docker Compose configuration examples.

## solr

Minimal Solr instance.

## solr-tika

Solr and Tika with Tesseract together.

## tika

Minimal Tika with Tesseract instance.
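
To use one of these, change into its folder and bring it up; `solr-tika` below stands in for whichever configuration you want, assuming it follows the same layout as the other folders here:

    cd docker/solr-tika
    docker compose up -d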

The Docker Compose configuration that runs Solr and Tika together is updated so that its volume paths resolve relative to the parent directory and Tika logs to a mounted directory instead of a single file:

    @@ -8,7 +8,7 @@ services:
         ports:
           - "8983:8983"
         volumes:
    -      - ./solr-data:/var/solr
    +      - ../solr-data:/var/solr
         # environment:
         #   - SOLR_CORE=my_core
         command:
    @@ -22,13 +22,13 @@ services:
         restart: unless-stopped
         entrypoint: [ "/bin/sh", "-c", "exec java -cp \"/customocr:/tika-server-standard-2.9.2.jar:/tika-extras/*\" org.apache.tika.server.core.TikaServerCli -h 0.0.0.0 $$0 $$@"]
         environment:
    -      - TIKA_LOG_PATH=/tika.log
    +      - TIKA_LOG_PATH=/tika-logs
         command: -c /tika-config.xml
         volumes:
    -      - ./tika-config.xml:/tika-config.xml
    -      - ./tika-data/logs/tika.log:/tika.log
    -      - ./tika-data/log4j2.xml:/log4j2.xml
    -      - ./TesseractOCRConfig.properties:/TesseractOCRConfig.properties
    +      - ../tika-config.xml:/tika-config.xml
    +      - ../tika-data/logs:/tika-logs
    +      - ../tika-data/log4j2.xml:/log4j2.xml
    +      - ../TesseractOCRConfig.properties:/TesseractOCRConfig.properties
         ports:
           - "9998:9998"

New file **docker/solr/docker-compose.yml**:

    version: '3'

    services:
      solr:
        image: solr:latest
        container_name: solr
        restart: unless-stopped
        ports:
          - "8983:8983"
        volumes:
          - ../solr-data:/var/solr
        # environment:
        #   - SOLR_CORE=my_core
        command:
          - solr-precreate
          # - gettingstarted
        user: "1000:995"

New file **docker/tika/docker-compose.yml** (as committed, its contents match the Solr example above):

    version: '3'

    services:
      solr:
        image: solr:latest
        container_name: solr
        restart: unless-stopped
        ports:
          - "8983:8983"
        volumes:
          - ../solr-data:/var/solr
        # environment:
        #   - SOLR_CORE=my_core
        command:
          - solr-precreate
          # - gettingstarted
        user: "1000:995"

The Virginia Law Library download script gains a progress message:

    @@ -2,6 +2,7 @@ mkdir public
     cd public
     mkdir Virginia_Law_Library
     cd Virginia_Law_Library
    +echo Downloading Virginia Law Library.
     wget https://law.lis.virginia.gov/library/constitution/constitution.epub -OutFile constitution.epub
     wget https://law.lis.virginia.gov/library/constitution/constitution.pdf -OutFile constitution.pdf
     wget https://law.lis.virginia.gov/CSV/Constitution.csv -OutFile Constitution.csv

Some files were not shown because too many files have changed in this diff.