Recently, while creating my personal portfolio website abulasar.dev, I added a blog page. For that page, I fetched my Hashnode blogs. Hashnode exposes its blogs through an API (you can visit api.hashnode.com), and this API uses GraphQL. Previously, I had fetched these blogs in an Ember.js app and wrote a blog on that topic. I had no prior experience using a GraphQL client in Elixir projects. I implemented it in this project, and this blog will be about using GraphQL queries in a Phoenix LiveView project. Let's build one!
## Action Plan

- Creating a LiveView project
- Installing and configuring a GraphQL client
- Adding an API layer to fetch data
- Integrating the API layer with the UI
## Let's start building

- First of all, as discussed, we will create a LiveView project.
- We will confirm that Elixir is installed by running `elixir -v`.
- Once confirmed, we need to generate the Phoenix LiveView application.
- This can be done by running `mix phx.new fetch_hashnode --no-ecto --live`.
- We will then navigate to the project directory with `cd fetch_hashnode` and run `mix phx.server`.
- Once the project starts running, navigate to `localhost:4000` to see the Welcome to Phoenix screen.
## Installing the GraphQL client

- I searched for GraphQL clients in Elixir and came across a few, but the one that caught my eye was Neuron. It is very easy to configure and use, so I decided to go with it.
- We will configure Neuron by first adding it to the list of dependencies.
- Open the `mix.exs` file and add Neuron to the list of dependencies:

```elixir
def deps do
  [
    ...,
    {:neuron, "~> 5.0.0"}
  ]
end
```

- Now run `mix deps.get` to install the dependency.
- Next, we are going to configure Neuron.
## Configuring Neuron

- To query the Hashnode API, we have to set a `url` in the Neuron configuration.
- As per the documentation, we can test Neuron in the `iex` shell by running the following:

```elixir
iex> Neuron.Config.set(url: "https://api.hashnode.com/")
```
- After adding this, you can query the Hashnode API as follows:
```elixir
Neuron.query("""
{
  user(username: "your_username") {
    publication {
      posts(page: 0) {
        title
        brief
        slug
        cuid
        coverImage
      }
    }
  }
}
""")
```
- This query will return a `Neuron` response with a list of blogs on page number 0, along with the nitty-gritty details such as the response headers, something like below:
```elixir
{:ok,
 %Neuron.Response{
   body: %{
     "data" => %{
       "user" => %{
         "publication" => %{
           "posts" => [
             %{...}, # blog 1
             %{...}  # blog 2
           ]
         }
       }
     }
   },
   headers: [
     {"Connection", "keep-alive"},
     {"Content-Length", "3186"},
     ...
   ],
   status_code: 200
 }}
```
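Before wiring this into the app, it helps to see how to dig the list of posts out of that nested body. Here is a minimal sketch using `get_in/2` on a made-up sample map (the field names mirror the response above; the post data itself is invented):

```elixir
# Sample body shaped like the Hashnode response above (post data is made up).
body = %{
  "data" => %{
    "user" => %{
      "publication" => %{
        "posts" => [
          %{"title" => "Blog 1", "slug" => "blog-1"},
          %{"title" => "Blog 2", "slug" => "blog-2"}
        ]
      }
    }
  }
}

# get_in/2 walks the nested maps and returns nil if any key is missing,
# which is handy if the API shape ever changes.
posts = get_in(body, ["data", "user", "publication", "posts"])
titles = Enum.map(posts, & &1["title"])
```

We will use this same `get_in/2` access pattern later when we plug the data into LiveView.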
- So far so good, everything is looking fine. In our next step, we'll add an API layer where we'll place the above query to fetch the data.
## Adding an API layer to fetch data

- We will add a module with two functions: one to fetch all blogs and one to fetch a specific blog.
- Create a folder `blog_posts` under `lib/fetch_hashnode`. In that, we'll add the `blog.ex` file. Add the following code to it:
```elixir
defmodule FetchHashnode.Blog do
  @username "your_hashnode_username"

  def get_blogs(page \\ 0) do
    Neuron.query("""
    {
      user(username: "#{@username}") {
        publication {
          posts(page: #{page}) {
            title
            brief
            slug
            cuid
            coverImage
          }
        }
      }
    }
    """)
  end

  def get_detail_blog(slug) do
    Neuron.query("""
    {
      post(slug: "#{slug}", hostname: "#{@username}") {
        title
        slug
        cuid
        coverImage
        content
        contentMarkdown
        tags {
          name
        }
      }
    }
    """)
  end
end
```
- The above two functions are self-explanatory. They are simple and very similar to what we tried earlier in the console.
- Let's fire up the `iex` shell by running `iex -S mix phx.server` and do some testing of these functions.
- We'll first test the `get_blogs` function, which is expected to return a list of all blogs on page 0 (because the default is 0).
- Run the following query:
```elixir
iex> FetchHashnode.Blog.get_blogs
```
- We will get the error `"you need to supply an url"`. So, what went wrong?
- If you remember, we registered the `url` with Neuron in the shell earlier by running `Neuron.Config.set(url: "https://api.hashnode.com/")`.
- We want Neuron to be aware of this `url` as soon as the server starts. So, we will add the above snippet to `application.ex` in the `start` function.
```elixir
def start(_type, _args) do
  children = [
    FetchHashnodeWeb.Telemetry, # no Repo here, since we generated the app with --no-ecto
    ....
  ]

  opts = [strategy: :one_for_one, name: FetchHashnode.Supervisor]
  Neuron.Config.set(url: "https://api.hashnode.com/") # This is the line
  ...
end
```
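As an alternative, if I read Neuron's documentation correctly, the URL can also be passed per request as an option to `Neuron.query/3` instead of being set globally. A sketch of that (no query variables, hence the empty map):

```elixir
# Hypothetical one-off query: passing url: as an option instead of
# relying on the global Neuron.Config.set/1 call.
Neuron.query(
  """
  {
    user(username: "your_username") {
      publication {
        posts(page: 0) { title }
      }
    }
  }
  """,
  %{},
  url: "https://api.hashnode.com/"
)
```

Setting it once at startup is still the more convenient choice here, since every query in this app targets the same endpoint.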
- Now, restart the server with `iex -S mix phx.server` and again query the function `FetchHashnode.Blog.get_blogs`, and voila, it's working now!
- Similarly, for the second function, i.e. `get_detail_blog`, we need to pass the `slug` of the blog. So, we will test the function with some `slug` like this:

```elixir
iex> FetchHashnode.Blog.get_detail_blog("writing-mix-task-in-elixir-phoenix")
```
- This will return a detailed version of the blog.
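Neuron can also return an `{:error, _}` tuple on failure (for example, a network error), so in a real app you may want to match both outcomes rather than assert `{:ok, _}`. A sketch of that, using the module from this post (the error handling itself is my own addition, not part of the original code):

```elixir
case FetchHashnode.Blog.get_detail_blog("writing-mix-task-in-elixir-phoenix") do
  {:ok, %Neuron.Response{body: body, status_code: 200}} ->
    # Dig the single post out of the GraphQL envelope.
    get_in(body, ["data", "post"])

  {:error, reason} ->
    # Network failure, bad URL, etc. Log it and fall back to an empty state.
    require Logger
    Logger.error("Failed to fetch blog: #{inspect(reason)}")
    nil
end
```

I'm skipping this in the rest of the post to keep the happy path short, but it is worth adding before shipping.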
## Integrating the API layer with the UI

- Now that we have a working API layer, we will add a LiveView page to display these fetched blogs.
- Add a `blogs` endpoint in `router.ex` by adding the following code:
```elixir
scope "/", FetchHashnodeWeb do
  pipe_through :browser

  live "/blogs", BlogsLive
end
```
Now add a file `lib/fetch_hashnode_web/live/blogs_live.ex` and add the following `mount` callback in the code:

```elixir
defmodule FetchHashnodeWeb.BlogsLive do
  use FetchHashnodeWeb, :live_view

  alias FetchHashnode.Blog

  def mount(_params, _session, socket) do
    {:ok, blogs} = Blog.get_blogs()
    blogs = get_in(blogs.body, ["data", "user", "publication", "posts"])

    socket = assign(socket, blogs: blogs)
    {:ok, socket}
  end
end
```
- We have aliased the `Blog` module and called the `get_blogs` function in the `mount` callback.
- We assigned the fetched response to the `blogs` variable using pattern matching.
- Then we extracted the list of blogs and assigned them to the `socket`; this makes them available to the view in the `render` callback.
- We will loop over the `blogs` assign in the view to display something like this:

```elixir
def render(assigns) do
  ~L"""
  <%= for blog <- @blogs do %>
    Title: <%= blog["title"] %>
    <hr>
  <% end %>
  """
end
```
- It will display a plain list of blog titles.
- Similarly, we can fetch a particular blog and display it on a separate LiveView page. I'll leave this logic up to you.

I hope you liked this blog. Of course, there is some scope for refactoring that I intentionally didn't touch. If you have any questions, please comment below. Thanks for reading!