This project is part of my Elevvo Internship (Data Analytics track).
The objective was to scrape job listings from a real website and analyze the data to identify the most common job titles and in-demand skills.
## Dataset
- Source: [Real Python Fake Jobs website](https://realpython.github.io/fake-jobs/)
- Each listing includes the job title, company, location, and date posted.
## Tools & Libraries
- Python
- Pandas
- Matplotlib
- Seaborn
- BeautifulSoup
- Requests
- Jupyter Notebook
## Workflow
- Web Scraping – Fetch job postings with Requests and parse them with BeautifulSoup
- Data Cleaning – Remove duplicates, extract the city from the location field
- Analysis – Identify the top job titles and most in-demand skills
- Data Visualization – Create clean bar plots of the top job titles and skills
- Insights & Conclusion – Interpret the results
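The scraping step can be sketched as follows. The `card-content`, `title`, and `location` class names are assumptions based on the Fake Jobs page markup, and an inline HTML sample stands in for the live page so the sketch runs offline; in the notebook the HTML would come from `requests.get(...)` against the site.

```python
from bs4 import BeautifulSoup
import pandas as pd

# In the notebook the HTML comes from the live page, e.g.:
#   html = requests.get("https://realpython.github.io/fake-jobs/", timeout=30).text
# Here a small inline sample (mimicking the assumed card markup) stands in.
SAMPLE_HTML = """
<div class="card-content">
  <h2 class="title">Senior Python Developer</h2>
  <h3 class="company">Payne, Roberts and Davis</h3>
  <p class="location">Stewartbury, AA</p>
  <time datetime="2021-04-08">2021-04-08</time>
</div>
<div class="card-content">
  <h2 class="title">Energy engineer</h2>
  <h3 class="company">Vasquez-Davidson</h3>
  <p class="location">Christopherville, AA</p>
  <time datetime="2021-04-08">2021-04-08</time>
</div>
"""

def parse_jobs(html: str) -> pd.DataFrame:
    """Extract one row per job card from the page HTML."""
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for card in soup.find_all("div", class_="card-content"):
        rows.append({
            "title": card.find("h2", class_="title").get_text(strip=True),
            "company": card.find("h3").get_text(strip=True),
            "location": card.find("p", class_="location").get_text(strip=True),
            "date_posted": card.find("time").get_text(strip=True),
        })
    return pd.DataFrame(rows)

jobs = parse_jobs(SAMPLE_HTML)
print(jobs)
```

Collecting rows into a list of dicts and building the DataFrame once at the end is cheaper than appending to a DataFrame inside the loop.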
## Key Findings
- Most common job title: Python Programmer (Entry-Level)
- Unique companies: 99
- Top skills overall: Python, Django (based on job title keywords)
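The cleaning and analysis steps can be sketched as below, with a tiny made-up frame standing in for the scraped data; the column names match the workflow above, but the sample values and the skill keyword list are illustrative.

```python
import pandas as pd

# Hypothetical sample standing in for the scraped listings.
jobs = pd.DataFrame({
    "title": ["Python Programmer (Entry-Level)", "Python Programmer (Entry-Level)",
              "Software Engineer (Python)", "Python Programmer (Entry-Level)"],
    "company": ["Acme", "Acme", "Globex", "Initech"],
    "location": ["Stewartbury, AA", "Stewartbury, AA",
                 "North Jamieview, AP", "Port Ericaburgh, AA"],
})

# Data cleaning: drop exact duplicate rows, then pull the city
# out of the "City, ST" location string.
jobs = jobs.drop_duplicates().reset_index(drop=True)
jobs["city"] = jobs["location"].str.split(",").str[0].str.strip()

# Analysis: rank job titles, and count skill keywords in the titles.
top_titles = jobs["title"].value_counts()
skills = ["Python", "Django", "SQL"]
skill_counts = {s: int(jobs["title"].str.contains(s, case=False).sum())
                for s in skills}

print(top_titles.head())
print(skill_counts)
```

Counting keywords in titles (rather than full descriptions) matches the "based on job title keywords" caveat in the findings above.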
## How to Run
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Open the notebook:
  ```bash
  jupyter notebook Job_Postings_Analysis.ipynb
  ```
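The visualization step can be sketched as a horizontal Seaborn bar plot; the counts below are illustrative placeholders for the `value_counts()` result derived in the notebook.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

# Illustrative counts; the notebook derives these from the scraped data.
top_titles = pd.Series({
    "Python Programmer (Entry-Level)": 10,
    "Software Engineer (Python)": 7,
    "Data Analyst": 5,
})

fig, ax = plt.subplots(figsize=(8, 4))
sns.barplot(x=top_titles.values, y=top_titles.index, ax=ax)
ax.set_xlabel("Number of postings")
ax.set_title("Top job titles")
fig.tight_layout()
fig.savefig("top_job_titles.png")
```

Horizontal bars keep long job titles readable, which is why the counts go on the x-axis.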

