Hello, my name is Mohamed, and I hold a Bachelor's degree in Computer Engineering.
Fascinated by the challenge of uncovering hidden insights, I pursued a data analysis bootcamp and honed my skills in Python, SQL and Power BI.
As a data analyst, my goal is to leverage these skills to improve results, support better decisions, and save costs.
I graduated from the AWS re/Start program in November 2022 and obtained the AWS Certified Cloud Practitioner certification.
I'm currently working as a Data Officer at Collateral Repair Project, where my responsibilities include:
Analyzing data (collection, cleaning, analysis, visualization)
Managing and building data pipelines from various sources, including databases, APIs, webhooks, Google Sheets, surveys, and files (see the sketch after this list)
Automating reports, ETL jobs, and web scraping
Building reports, dashboards and presentations
Reconciling monthly transactions to identify and resolve errors, ensuring accurate financial reporting
Managing user accounts and access permissions across Google Workspace, including Gmail, Drive, and other essential tools. I ensure secure collaboration by resolving any access issues on shared drives.
Managing and configuring user permissions in our attendance app for accurate data tracking. Additionally, I create and manage VPN profiles for authorized remote access.
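For illustration, here is a minimal sketch of one such pipeline step, pulling a shared Google Sheet into pandas; the sheet ID, tab name, and output file are hypothetical stand-ins, not the actual pipeline:

```python
import pandas as pd

# Hypothetical sheet ID and tab name; any sheet shared as
# "anyone with the link" can be exported as CSV this way
SHEET_ID = "PLACEHOLDER_SHEET_ID"
SHEET_NAME = "responses"
url = (
    f"https://docs.google.com/spreadsheets/d/{SHEET_ID}"
    f"/gviz/tq?tqx=out:csv&sheet={SHEET_NAME}"
)

# Load, standardize headers, deduplicate, and hand off to the next stage
df = pd.read_csv(url)
df.columns = df.columns.str.strip().str.lower().str.replace(" ", "_")
df = df.drop_duplicates()
df.to_csv("clean_responses.csv", index=False)
```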
As an avid gardener, I needed to know the best time of year for agricultural activities. For example, when grafting citrus trees, the temperature range at which citrus wounds heal best is between 21°C and 29°C, according to fruitmentor.com.
Knowing what the temperature and weather are like where I live gives me an idea of what I can or can't plant, graft, or grow. I can't grow mangoes where I live, because mangoes are freeze-sensitive and require warm, humid weather all year round.
The following are the steps I took in this project, from collecting the data to visualizing it:
Web Scraping: scraping the historical hourly data from a website using Selenium (a minimal sketch follows this list)
Data Cleansing: cleaning and modifying the data with regular expressions, then converting it into a CSV file for easier handling
Data Visualization: further cleaning and analysis to extract the data needed for visualization and reporting with pandas and Matplotlib (see the second sketch below)
Power BI Dashboard: telling the story through visuals
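For illustration, a minimal sketch of the scraping step, assuming Selenium with the Chrome driver; the URL and the element selectors are hypothetical stand-ins for the actual weather site:

```python
import csv
from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical URL; the real project scraped a historical-weather site
URL = "https://example-weather-site.com/amman/history"

driver = webdriver.Chrome()
driver.get(URL)

# Hypothetical selector: one table row per hourly observation
rows = driver.find_elements(By.CSS_SELECTOR, "table.hourly tr")
records = []
for row in rows:
    cells = [td.text for td in row.find_elements(By.TAG_NAME, "td")]
    if cells:  # skip header rows, which have no <td> cells
        records.append(cells)

driver.quit()

# Persist the raw scrape for the cleaning step
with open("hourly_raw.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(records)
```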
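And a sketch of the cleaning and visualization steps with pandas and Matplotlib; the column names and the regex are assumptions about the scraped text, not the project's actual code:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical raw columns, e.g. "temp" holds strings like "27 °C"
df = pd.read_csv("hourly_raw.csv", names=["datetime", "temp", "humidity"])

# Regex cleanup: keep only the numeric part of the temperature
df["temp_c"] = (
    df["temp"].astype(str)
    .str.extract(r"(-?\d+(?:\.\d+)?)", expand=False)
    .astype(float)
)
df["datetime"] = pd.to_datetime(df["datetime"], errors="coerce")
df = df.dropna(subset=["datetime", "temp_c"])

# Monthly mean temperature, e.g. to spot the 21-29 °C grafting window
monthly = df.set_index("datetime")["temp_c"].resample("ME").mean()
monthly.plot(title="Mean monthly temperature")
plt.ylabel("°C")
plt.tight_layout()
plt.show()
```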
Weather Analysis Jupyter Notebook
In April 2021, the data of more than half a billion Facebook users from 106 countries was leaked, around three million of them Jordanians. The data was published on a hacking forum; Facebook said the data was old, from a previously reported leak in 2019, and denied any wrongdoing, saying the data was scraped from publicly available information on the site!
Although the sample is huge, it doesn't reflect the real world in some aspects: males outnumber females two to one, and birth years follow an almost exponential pattern, with birthdays in 1995 at the top; perhaps there was a pattern in how the data was leaked.
Mohamed is the most common name, which is plausible given that Mohamed is one of the most common names worldwide. The most used carrier is Zain at 46.6%. What I find hard to believe is that Yahoo is the top email domain.
This repo was made to analyze that data; the data was mostly in good, clean shape and just needed a few touches and modifications.
The exploratory analysis does not show sensitive data such as emails or phone numbers, just simple breakdowns like phone carrier, email domain, religion, and gender, as in the sketch below.
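For illustration, a minimal sketch of that kind of aggregate-only analysis; the file name, column names, and the prefix-to-carrier mapping are hypothetical, not the dataset's actual schema:

```python
import pandas as pd

# Hypothetical columns; only aggregates are ever printed, never raw rows
df = pd.read_csv("leak.csv", usecols=["phone", "email", "gender"])

# Email domain distribution (the address itself is discarded)
domains = df["email"].str.split("@").str[-1].str.lower()
print(domains.value_counts(normalize=True).head(10))

# Carrier share from the two-digit mobile prefix; this mapping for
# Jordanian numbers is an illustrative assumption
prefix_map = {"77": "Orange", "78": "Umniah", "79": "Zain"}
prefix = (
    df["phone"].astype(str)
    .str.replace(r"^\+?962", "", regex=True)
    .str[:2]
)
print(prefix.map(prefix_map).value_counts(normalize=True))

# Gender split
print(df["gender"].value_counts(normalize=True))
```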
Scraping different content, data, and elements from the web quickly and efficiently, then cleaning and organizing the data in different ways and formats.
Automation: controlling the computer, mouse, and keyboard to create macro-like procedures with minimal human intervention (a sketch follows below).
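As a sketch of that kind of desktop automation, here is a minimal macro using the pyautogui library; the coordinates and the text typed are arbitrary examples, not a real workflow:

```python
import time
import pyautogui

# Safety net: slamming the mouse into a screen corner aborts the script
pyautogui.FAILSAFE = True
pyautogui.PAUSE = 0.5  # half-second pause after every pyautogui call

time.sleep(3)  # time to focus the target window before the macro runs

# Hypothetical macro: click a field, type a value, tab onward, save
pyautogui.click(400, 300)          # arbitrary example coordinates
pyautogui.write("monthly report", interval=0.05)
pyautogui.press("tab")
pyautogui.hotkey("ctrl", "s")
```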
The GIF below is an example of web scraping: it shows the execution of this code, where the program collects all the available data on medicine from the Jordan Food and Drug Administration (JFDA).
The data includes audience price, hospital price, trade name, concentration, tax, and dealer.
GIF Showing Web Scraping In Action
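For illustration, a minimal sketch of collecting rows like these into a DataFrame, assuming the listing is served as a static HTML table; the URL and column layout here are hypothetical, and the real code linked above uses its own selectors:

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

# Hypothetical URL standing in for the JFDA medicine listing
URL = "https://example.jfda.jo/medicines?page=1"

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Assumed layout: one <tr> per medicine, cells in a fixed order
records = []
for tr in soup.select("table tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 6:
        records.append(cells[:6])

cols = ["trade_name", "concentration", "audience_price",
        "hospital_price", "tax", "dealer"]
df = pd.DataFrame(records, columns=cols)
df.to_csv("jfda_medicines.csv", index=False)
```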
And the notebook below shows API calling: collecting the data, storing it in a DataFrame, and finally visualizing it.
API calling Notebook
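For illustration, a minimal version of that flow, using the free Open-Meteo forecast API as a stand-in; the notebook's actual API is not named here, so this endpoint is an assumption:

```python
import pandas as pd
import matplotlib.pyplot as plt
import requests

# Open-Meteo needs no API key; the coordinates below are roughly Amman
params = {
    "latitude": 31.95,
    "longitude": 35.93,
    "hourly": "temperature_2m",
}
resp = requests.get("https://api.open-meteo.com/v1/forecast",
                    params=params, timeout=30)
resp.raise_for_status()
hourly = resp.json()["hourly"]

# Store the response in a DataFrame, then visualize it
df = pd.DataFrame({"time": pd.to_datetime(hourly["time"]),
                   "temp_c": hourly["temperature_2m"]})
df.plot(x="time", y="temp_c", title="Hourly forecast, Amman")
plt.ylabel("°C")
plt.tight_layout()
plt.show()
```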
Contact
Mohamed
m.suwan@outlook.com
Location
Amman, Jordan - Remote Work - Willing to Relocate.
Based on your objectives, we will make a plan to reach the right insights.
I am available for: