Add sqlite db #12

Merged 49 commits on Sep 6, 2024
Commits
62e8a87
chore: bump version and move config to internal
husni-zuhdi Sep 1, 2024
ff74146
chore: ignore sqlite files
husni-zuhdi Sep 1, 2024
36eacfa
feat: initial migrations for sqlite
husni-zuhdi Sep 1, 2024
fd75654
feat: implement hexagonal architecture for husni-portfolio
husni-zuhdi Sep 1, 2024
3c60d89
feat: update data model and add memory as database
husni-zuhdi Sep 1, 2024
ecf039f
feat: update router, handler, github api, and utils
husni-zuhdi Sep 1, 2024
07d00e8
chore: update sqlx task related and update env.example
husni-zuhdi Sep 1, 2024
3545dc7
chore: add dyn-clone dependencies
husni-zuhdi Sep 1, 2024
7c4b179
chore: remove commented code, remove unused functions, and trimming l…
husni-zuhdi Sep 1, 2024
e22bde5
feat: implement dyn-clone lib
husni-zuhdi Sep 1, 2024
76557fd
feat: update naming and implement Arc Mutex
husni-zuhdi Sep 1, 2024
fbbcaf1
chore: update readme to replace actix -> axum
husni-zuhdi Sep 1, 2024
a1e65f4
chore: exploring markdown-rs frontmatter construct related to md yaml…
husni-zuhdi Sep 2, 2024
73ef3d2
fix: typo in arg
husni-zuhdi Sep 2, 2024
e10f9ac
chore: remove postgre envar and add sqlite/data envar
husni-zuhdi Sep 2, 2024
609d59c
feat: add axum macro feature to help debug handler
husni-zuhdi Sep 2, 2024
732374b
feat: update blog hexagonal arch to support async_trait
husni-zuhdi Sep 2, 2024
4c2f6c3
feat: implement FromRow for Blog and add BlogPagination for query param
husni-zuhdi Sep 2, 2024
4444bdc
feat: add sqlite db layer and update memory db to use async fn
husni-zuhdi Sep 2, 2024
75d2f54
feat: remove postgre envar and add sqlite/data envar
husni-zuhdi Sep 2, 2024
4263ba1
chore: improve build state fn
husni-zuhdi Sep 2, 2024
b50d111
chore: change &mut self to &self for queries and add Sync trait
husni-zuhdi Sep 2, 2024
37e73d5
feat: change std mutex to tokio mutex and update FromRow BlogSource i…
husni-zuhdi Sep 2, 2024
3cade23
chore: change &mut self to &self for db repo and fix several sqlite q…
husni-zuhdi Sep 2, 2024
b9dd825
feat: update config::from_envar implementation
husni-zuhdi Sep 2, 2024
bc4e16d
feat: add debug_handler, improve pagination, and apply tokio mutex
husni-zuhdi Sep 2, 2024
3c34a7b
feat: add inital sqlite migration
husni-zuhdi Sep 2, 2024
ca27c15
chore: add docs
husni-zuhdi Sep 3, 2024
5625ecf
chore: change default endpoint and environment naming
husni-zuhdi Sep 3, 2024
0650bdc
feat: improve pagionation on memory repo
husni-zuhdi Sep 3, 2024
f0e5f00
fix: update command port naming
husni-zuhdi Sep 3, 2024
95f13f9
chore: tidy up Version model
husni-zuhdi Sep 3, 2024
7f00bc2
chore: separate BlogsTemplate from BlogTemplate data
husni-zuhdi Sep 3, 2024
cf37c3b
chore: update Version and BlogsTemplate implementation
husni-zuhdi Sep 3, 2024
f09bc2c
chore: refactor handler functions
husni-zuhdi Sep 3, 2024
f441ed6
feat: add api port, repo, and usecase
husni-zuhdi Sep 5, 2024
67e60f8
feat: add implementation of api usecase for github and filesystem
husni-zuhdi Sep 5, 2024
4285d6e
feat: add chekc_id method for blog
husni-zuhdi Sep 5, 2024
7b47d0a
feat: add BlogMetadata, BlogStored, and implement FromRow for BlogId
husni-zuhdi Sep 5, 2024
ec8f851
feat: add type related to GithubTree and enable access data on severa…
husni-zuhdi Sep 5, 2024
fe75c77
feat: add check_id method and remove api-related implementation
husni-zuhdi Sep 5, 2024
28ea638
feat: add check_id method
husni-zuhdi Sep 5, 2024
8f1d70c
feat: update state_factory to implement new api usecases
husni-zuhdi Sep 5, 2024
764f728
chore: change error.rs to status.rs and their implementation
Sep 6, 2024
0490227
chore: remove unused comment
Sep 6, 2024
42aa66e
chore: move processing markdwon function to a method under filesystem
Sep 6, 2024
6170fc0
chore: move method documentation
Sep 6, 2024
c80d715
chore: add TODO to change BlogId to int
Sep 6, 2024
d2091d8
chore: remove unused md_to_html
Sep 6, 2024
1 change: 1 addition & 0 deletions .gitignore
@@ -1,3 +1,4 @@
**/target
**/Cargo.lock
**/.env
husni-portfolio.db**
3 changes: 2 additions & 1 deletion README.md
@@ -3,9 +3,10 @@ My Portfolio webiste

## Tools I use in this repo
* [Rust Programming Language](https://www.rust-lang.org/)
* [Actix](https://actix.rs/)
* [Axum](https://github.com/tokio-rs/axum/tree/main)
* [Askama](https://github.com/djc/askama)
* [Markdown-rs](https://github.com/wooorm/markdown-rs)
* [Octocrab](https://github.com/XAMPPRocky/octocrab)
* [Taskfile](https://taskfile.dev/)
* [TailwindCSS](https://tailwindcss.com/)

21 changes: 18 additions & 3 deletions Taskfile.yml
@@ -1,4 +1,6 @@
version: '3'
dotenv:
- '.env'

tasks:
test:
@@ -8,9 +10,22 @@ tasks:
run:
summary: Run application with hot-reload
cmds:
# Add tailwindcss build for hot reloading
- tailwindcss -i ./statics/input.css -o ./statics/styles.css
- cargo watch -s 'export $(cat .env | xargs) && cargo run -- -release'
- cargo watch -s 'tailwindcss -i ./statics/input.css -o ./statics/styles.css && export $(cat .env | xargs) && cargo run -- -release'

sqlx-create:
summary: Create db with sqlx
cmds:
- sqlx db create --database-url $DATABASE_URL
sqlx-migrate-run:
summary: Migrate db with sqlx
cmds:
- sqlx migrate run --source internal/migrations --database-url $DATABASE_URL
sqlx-migrate-add:
summary: Create a new db migration with sqlx. Please pass the migration description too
vars:
DESCRIPTION: '{{index .MATCH 0}}'
cmds:
- sqlx migrate add --source internal/migrations --database-url $DATABASE_URL {{.DESCRIPTION}}

docker-build:
summary: Build Docker Image
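The new sqlx-* tasks wrap the sqlx CLI: sqlx-create creates the SQLite file named by DATABASE_URL (loaded from .env via the dotenv block), and sqlx-migrate-run applies the migrations under internal/migrations. Purely as an illustration, not code from this PR, the same migrations directory can also be embedded and run at startup with the sqlx crate; this sketch assumes it lives inside the internal crate and that the database file already exists:

use sqlx::sqlite::SqlitePoolOptions;

async fn run_migrations() -> Result<(), Box<dyn std::error::Error>> {
    // DATABASE_URL as in env.example, e.g. "sqlite:husni-portfolio.db".
    // Run `task sqlx-create` first, or enable create_if_missing on the connect options.
    let url = std::env::var("DATABASE_URL")?;
    let pool = SqlitePoolOptions::new().connect(&url).await?;
    // The path is relative to the crate's Cargo.toml, so "./migrations" here maps to
    // internal/migrations in this repository.
    sqlx::migrate!("./migrations").run(&pool).await?;
    Ok(())
}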
4 changes: 2 additions & 2 deletions cmd/Cargo.toml
@@ -1,10 +1,10 @@
[package]
name = "cmd"
version = "0.1.3"
version = "0.2.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
internal = { path = "../internal", version = "0.1.3"}
internal = { path = "../internal", version = "0.2.0"}
tokio = { version = "1.0", features = ["full"] }
6 changes: 2 additions & 4 deletions cmd/src/main.rs
@@ -1,8 +1,6 @@
use internal::{self, config::Config, handler::handler};
use internal::app::app;

#[tokio::main]
async fn main() -> std::io::Result<()> {
let config = Config::from_envar();
handler(config).await;
Ok(())
Ok(app().await)
}
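main.rs now hands startup to internal::app::app, which is not shown in this diff view. Purely as a hedged sketch of what an axum 0.7 entry point of that shape can look like, with config loading and the shared state from state_factory elided (the env var defaults below are assumptions taken from env.example):

// Hypothetical sketch -- not the actual internal::app::app added by this PR.
pub async fn app() {
    let endpoint = std::env::var("SVC_ENDPOINT").unwrap_or_else(|_| "127.0.0.1".to_string());
    let port = std::env::var("SVC_PORT").unwrap_or_else(|_| "8080".to_string());
    let router = axum::Router::new(); // real routes and shared state omitted here
    let listener = tokio::net::TcpListener::bind(format!("{endpoint}:{port}"))
        .await
        .expect("failed to bind address");
    axum::serve(listener, router).await.expect("server error");
}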
7 changes: 2 additions & 5 deletions env.example
@@ -2,11 +2,8 @@ SVC_ENDPOINT="127.0.0.1"
SVC_PORT="8080"
LOG_LEVEL="info"
ENVIRONMENT="dev"
POSTGRES_USER="admin"
POSTGRES_PASSWORD="admin-password"
POSTGRES_DB="testing"
POSTGRES_HOST="127.0.0.1"
POSTGRES_PORT="5432"
DATA_SOURCE=sqlite
DATABASE_URL="sqlite:husni-portfolio.db"
GITHUB_OWNER=husni-zuhdi
GITHUB_REPO=husni-blog-resources
GITHUB_BRANCH=main
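The Postgres variables give way to DATA_SOURCE and DATABASE_URL. A minimal sketch of how a Config::from_envar (the call that used to sit in main.rs) can read them; the field names and defaults are assumptions, not the PR's actual struct:

use std::env;

#[derive(Clone, Debug)]
pub struct Config {
    pub svc_endpoint: String,
    pub svc_port: String,
    pub log_level: String,
    pub environment: String,
    pub data_source: String,  // "sqlite" or "memory"
    pub database_url: String, // e.g. "sqlite:husni-portfolio.db"
}

impl Config {
    pub fn from_envar() -> Config {
        // Fall back to a default when the variable is unset.
        let get = |key: &str, default: &str| env::var(key).unwrap_or_else(|_| default.to_string());
        Config {
            svc_endpoint: get("SVC_ENDPOINT", "127.0.0.1"),
            svc_port: get("SVC_PORT", "8080"),
            log_level: get("LOG_LEVEL", "info"),
            environment: get("ENVIRONMENT", "dev"),
            data_source: get("DATA_SOURCE", "memory"),
            database_url: get("DATABASE_URL", ""),
        }
    }
}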
8 changes: 6 additions & 2 deletions internal/Cargo.toml
@@ -1,13 +1,13 @@
[package]
name = "internal"
version = "0.1.3"
version = "0.2.0"
edition = "2021"
build = "build.rs"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
axum = "0.7.5"
axum = { version = "0.7.5", features = ["macros"] }
tokio = { version = "1.0", features = ["full"] }
# tracing = "0.1" # Might not need this
# tracing-subscriber = { version = "0.3", features = ["env-filter"] } # Might not need this
@@ -23,6 +23,10 @@ test-log = "0.2.16"
octocrab = "0.39.0"
http-body-util = "0.1.2"
regex = "1.10.6"
sqlx = { version = "=0.8.1", features = ["sqlite", "runtime-tokio"] }
async-trait = "0.1.81"
dyn-clone = "1.0.17"
# rusqlite = "=0.32.1"

[build-dependencies]
anyhow = "1.0.86"
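Among the new dependencies, async-trait lets the API port declare async methods on a trait, and dyn-clone makes boxed implementations cloneable (see the "implement dyn-clone lib" commit). Judging from the FilesystemApiUseCase implementation further down, the ApiRepo port plausibly looks like the following sketch; the exact bounds and the clone_trait_object call are assumptions:

use async_trait::async_trait;
use dyn_clone::DynClone;

// Blog and BlogMetadata are the models from internal/src/model/blog.rs.
#[async_trait]
pub trait ApiRepo: DynClone + Send + Sync {
    async fn list_metadata(&self) -> Vec<BlogMetadata>;
    async fn fetch(&self, metadata: BlogMetadata) -> Blog;
}

dyn_clone::clone_trait_object!(ApiRepo);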
8 changes: 8 additions & 0 deletions internal/migrations/20240901103916_initial_migration.sql
@@ -0,0 +1,8 @@
-- Add migration script here
CREATE TABLE IF NOT EXISTS blogs (
id TEXT PRIMARY KEY NOT NULL,
name TEXT NOT NULL,
source TEXT NOT NULL,
filename TEXT NOT NULL,
body TEXT NOT NULL
);
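Every column is TEXT, with id as the primary key; this matches the wrapper types on the Blog model (BlogId, BlogName, and so on all wrap String). Given the PR's note that Blog implements sqlx FromRow, a lookup against this table can be written roughly as below; the BlogRow struct and the helper function are placeholders, not the PR's code:

use sqlx::{FromRow, SqlitePool};

#[derive(Debug, FromRow)]
struct BlogRow {
    id: String,
    name: String,
    source: String,
    filename: String,
    body: String,
}

async fn get_blog(pool: &SqlitePool, id: &str) -> Result<Option<BlogRow>, sqlx::Error> {
    sqlx::query_as::<_, BlogRow>(
        "SELECT id, name, source, filename, body FROM blogs WHERE id = ?",
    )
    .bind(id)
    .fetch_optional(pool) // None when the id is not stored
    .await
}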
113 changes: 113 additions & 0 deletions internal/src/api/filesystem.rs
@@ -0,0 +1,113 @@
use crate::model::blog::{
Blog, BlogBody, BlogFilename, BlogId, BlogMetadata, BlogName, BlogSource,
};
use crate::repo::api::ApiRepo;
use crate::utils::capitalize;
use async_trait::async_trait;
use log::{debug, info};
use markdown::{to_html_with_options, CompileOptions, Constructs, Options, ParseOptions};
use std::fs;
use std::path::PathBuf;

#[derive(Clone)]
pub struct FilesystemApiUseCase {
pub blogs_dir: String,
}

#[async_trait]
impl ApiRepo for FilesystemApiUseCase {
async fn list_metadata(&self) -> Vec<BlogMetadata> {
let read_dir = fs::read_dir(self.blogs_dir.clone()).expect("Failed to read dir");
let blogs_metadata: Vec<BlogMetadata> = read_dir
// Collect Blog Filename
.filter_map(|blog_path| {
let blog_path_buf = blog_path.expect("Failed to get blog DirEntry").path();
Self::process_blog_path(&self, blog_path_buf)
})
// Collect Blog Metadata
.map(|blog_filename| Self::process_blog_metadata(&self, blog_filename))
.collect();
blogs_metadata
}
async fn fetch(&self, metadata: BlogMetadata) -> Blog {
let body = Self::process_markdown(metadata.filename.0.clone())
.expect("Failed to convert markdown to html");
debug!("Blog Body with Id {}: {}", &metadata.id.0, &body);

Blog {
id: metadata.id,
name: metadata.name,
source: BlogSource::Filesystem,
filename: metadata.filename,
body: BlogBody(body),
}
}
}

impl FilesystemApiUseCase {
pub async fn new(blogs_dir: String) -> FilesystemApiUseCase {
FilesystemApiUseCase { blogs_dir }
}
/// Process Blog Path from a PathBuf
/// Returned an Option String
fn process_blog_path(&self, blog_path_buf: PathBuf) -> Option<String> {
if blog_path_buf.is_file() {
blog_path_buf
.file_name()
.expect("Failed to get filename")
.to_str()
.map(|str| str.to_owned())
} else {
None
}
}
/// Process Blog Metadata from Blog Filename
/// Returned BlogMetadata
fn process_blog_metadata(&self, blog_filename: String) -> BlogMetadata {
let (id, name_init) = blog_filename
.split_once("-")
.expect("Failed to split filename into id and name");
let name_lower = name_init
.replace("_", " ")
.split_once(".")
.expect("Failed to remove file extension.")
.0
.to_string();
let name = capitalize(&name_lower);
let filename = format!("{}{}", self.blogs_dir, &blog_filename);
info!("Blog Metadata with Id {} has been processed.", &id);
debug!("Blog Name with Id {}: {}", &id, &name);
debug!("Blog Filename with Id {}: {}", &id, &filename);

BlogMetadata {
id: BlogId(id.to_string()),
name: BlogName(name),
filename: BlogFilename(filename),
}
}
/// Process Markdown
/// take String of filename and convert markdown file into html with option
/// return String of converted markdown in html or String of error
fn process_markdown(filename: String) -> Result<String, String> {
let body_md =
fs::read_to_string(filename.clone()).expect("Failed to read markdown blog file");
debug!("Markdown Body for filename {}: {}", &filename, body_md);

let html = to_html_with_options(
&body_md,
&Options {
parse: ParseOptions {
constructs: Constructs {
// In case you want to activate frontmatter in the future
// frontmatter: true,
..Constructs::gfm()
},
..ParseOptions::gfm()
},
compile: CompileOptions::gfm(),
},
)
.expect("Failed to convert html with options");
Ok(html)
}
}
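A hedged usage sketch for the new use case; the blogs directory path is an assumption, and the tuple-field access presumes the newtype fields stay public:

// Hypothetical caller, e.g. a test or the state factory.
// ApiRepo must be in scope for the trait methods to resolve.
async fn demo() {
    let api = FilesystemApiUseCase::new("./statics/blogs/".to_string()).await;
    for metadata in api.list_metadata().await {
        let blog = api.fetch(metadata).await;
        println!("blog {} rendered to {} bytes of HTML", blog.id.0, blog.body.0.len());
    }
}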