Hi, I'm Fornesus

Buenas at Kumusta (good day and hello)! I'm Fornesus, and I build experiences and solve problems using code, AI, design, and whatever resources I have at my disposal.

Genocide Data Portfolio

I created the Genocide Data Portfolio to shed light on data capturing the sheer inhumanity of conflicts across the world that amount to genocide.

From the ICE deportation crisis in the U.S. since the beginning of the second Trump administration in 2025, to the uneven Gaza "war" that began after the October 7 attacks, the world has seen the human toll of reckless ideologies that promote inhumanity.

Such ideologies are a direct effect of political polarization, which has bubbled up as social media algorithms have become ever more prominent in shaping the minds of their users.

My hope for this set of dashboards is twofold: to serve as a professional portfolio of data products that make news articles and data on genocidal events more accessible to a broader audience, and to tangibly help users understand the intersections between conflicts across the world. These products also show the cost of the world's silence on the plight of peoples subjected to genocidal conflict, especially since numerous internal conflicts in many nations essentially amount to genocide. The term is still often assumed to be rare and exclusive when, in material reality, it applies more often than we believe.

Finally, as a Filipino American, I keep in mind that the Filipino scholar Luzviminda Francisco uncovered the deflation of casualty figures from the American genocide in the Philippines, which took place during the first years of the Philippine-American War (1899-1913) and fell especially hard on the native lands of the Agta and Southern Tagalog peoples in Southern Luzon. Though the official American count of civilian deaths in this conflict still stands at 250,000, Francisco uncovered documentation and other accounts in the 1960s and 1970s (well after the end of both the conflict and formal American colonization of the Philippines) proving that the official figures were a gravely negligent undercount: the deaths actually amounted to 1.4 million people across the Philippines, or one sixth of the native population.

This context presents, to me, a deontological moral imperative: to use the skills I have been learning for uncovering, manipulating, and gaining insight from data to shed light on the plight of those facing genocide across the world, and to ensure data transparency for their respective struggles. This is especially urgent in the Global South, where a lack of accountability and transparency (as well as immoral obfuscation of the facts on the ground) often means that, as in the Philippines, the true scope of a conflict's human toll is either lost to history or discovered far too late for true justice to reach the direct victims of such atrocities.

For the best results, view this Sway in full-screen mode on a desktop (or on any device with a mouse).

Fornesus - Coded Art

This Jupyter notebook reviews the steps I took in creating my first web scraping project, which uses the BeautifulSoup4 and Requests Python libraries to scrape data from three of my websites: Fornesus - Coded Art, Exhibit Fornesus, and Fornesus Photography. I also walk through how I integrated GitHub Copilot into this workflow, though a solid foundation in Python was still necessary for me to refactor the code efficiently. You can find the associated Colab notebook at this link.
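As a rough, minimal sketch of the technique (not the notebook's actual code; the URL and the tags selected here are placeholders), a Requests + BeautifulSoup4 scrape looks like this:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL; the real project targets three of my own sites.
URL = "https://example.com"

# Fetch the page, failing loudly on a bad HTTP status.
response = requests.get(URL, timeout=10)
response.raise_for_status()

# Parse the HTML and pull out headings and links as sample data.
soup = BeautifulSoup(response.text, "html.parser")

for heading in soup.find_all(["h1", "h2"]):
    print(heading.get_text(strip=True))

for link in soup.find_all("a", href=True):
    print(link["href"])
```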

The live web output of each of the three tests can be found at the following links:

The associated website was created using the Flask template at Replit and Replit's native AI feature. You can find the codebase for this site at this link.
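For readers unfamiliar with Flask, a minimal app of the kind that Replit's Flask template scaffolds looks roughly like the sketch below; the route and message are illustrative placeholders, not the site's actual code:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Placeholder response; the real site serves the scraped results.
    return "Hello from a minimal Flask app!"

if __name__ == "__main__":
    # Replit exposes apps on 0.0.0.0 so they are reachable externally.
    app.run(host="0.0.0.0", port=5000)
```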

Exhibit Fornesus

One of the three websites scraped in the web scraping project described under Fornesus - Coded Art above; the same Colab notebook and Replit codebase cover this site.

Fornesus Photography

The third of the three websites scraped in the web scraping project above; again, the same Colab notebook and Replit codebase apply.

Tanaga Poetry Generator

This is my first Jupyter notebook, which I created by following Codecademy's guidelines and template. You can find the associated Colab notebook at this link.
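The notebook's exact method isn't reproduced here, but since a tanaga is a traditional Filipino quatrain of seven-syllable lines, a template-based generator in the same spirit can simply assemble a stanza from a bank of pre-written seven-syllable lines. Everything below, including the line bank, is invented for illustration:

```python
import random

# Illustrative bank of seven-syllable lines; a tanaga is a
# four-line Filipino poem with seven syllables per line.
# (Traditional tanagas also rhyme; this sketch skips rhyme.)
LINE_BANK = [
    "The river remembers rain",
    "A lantern swings in the dark",
    "Old stories sleep in the stones",
    "The moon keeps watch on the field",
    "Salt wind carries fishers home",
    "Morning folds over the hills",
]

def generate_tanaga() -> str:
    """Pick four distinct lines at random to form one stanza."""
    return "\n".join(random.sample(LINE_BANK, 4))

print(generate_tanaga())
```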

Skills Certificates
