Prioritization for Data Work
Prioritization in data work means selecting tasks that are both innovative and impactful. Adopting a scientific framework, including a first-principles approach and second-order thinking, can help you break problems down and understand their complexities. Consistency and documented evidence improve prioritization methodologies over time. Avoid focusing only on reporting or spending time on irrelevant tasks.
Learn Prioritization for Data Work with the Practica AI Coach
The Practica AI Coach helps you develop your Prioritization for Data Work skills by using your current work challenges as opportunities to improve. The AI Coach will ask you questions, instruct you on concepts and tactics, and give you feedback as you make progress.
Curated Learning Resources
- Prioritizing Data Science Work: As a data scientist, you are constantly deciding which tasks to prioritize. Stakeholders make many requests, but not all of them are equally impactful or innovative. Jacqueline recommends prioritizing projects that are both innovative and impactful, since they have the greatest potential to change the business. Projects that are not innovative but still provide useful proof can also be valuable. Jacqueline advises against getting stuck doing interesting but irrelevant work or only reporting, as these contribute less to the company. Data scientists should aim to do work that both affects the company and is innovative (a simple scoring sketch follows this list).
- Prioritising the Scientific Way: Shyam proposes a scientific framework for prioritization consisting of a first-principles approach and second-order thinking. The first-principles approach breaks problems down into fundamental components to remove biases. Second-order thinking considers the consequences of consequences to uncover hidden impacts and complexities. Shyam outlines four types of complexity: structural, technical, temporal, and directional, and distinguishes complications from complexities. A scientific prioritization process documents evidence so the methodology improves over time. Consistency is key to letting positive effects compound, and documentation transfers accountability to the process rather than to individuals (see the decision-log sketch after this list).
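
To make the impact-and-innovation framing concrete, here is a minimal sketch in Python. The task names, the 1-to-5 scores, and the quadrant labels are assumptions for illustration only; the sketch ranks a hypothetical backlog by the two dimensions rather than reproducing Jacqueline's method exactly.

```python
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    impact: int       # 1-5: how much the work could change the business
    innovation: int   # 1-5: how novel the question or approach is


def classify(task: Task) -> str:
    """Rough quadrant labels inspired by the impact/innovation framing."""
    if task.impact >= 3 and task.innovation >= 3:
        return "prioritize: impactful and innovative"
    if task.impact >= 3:
        return "reporting-style work: useful but less transformative"
    if task.innovation >= 3:
        return "interesting but possibly irrelevant: validate the business need"
    return "deprioritize"


# Hypothetical backlog used only for illustration.
backlog = [
    Task("Churn prediction model", impact=5, innovation=4),
    Task("Weekly KPI report refresh", impact=3, innovation=1),
    Task("Exotic clustering experiment", impact=1, innovation=5),
    Task("Ad-hoc vanity metric pull", impact=1, innovation=1),
]

for task in sorted(backlog, key=lambda t: t.impact + t.innovation, reverse=True):
    print(f"{task.name}: {classify(task)}")
```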
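
Shyam's point about documenting evidence can also be sketched as a lightweight decision log. The record fields and the append-to-a-JSON-lines-file approach below are assumptions chosen for illustration, not part of the article; the idea is simply that each prioritization call is written down with its reasoning so the process, rather than an individual, carries the accountability.

```python
import json
from dataclasses import dataclass, asdict
from datetime import date


@dataclass
class PrioritizationRecord:
    task: str
    decision: str            # e.g. "prioritized", "deferred", "dropped"
    evidence: str            # the first-principles reasoning behind the call
    second_order_risks: str  # anticipated consequences of consequences
    decided_on: str


def log_decision(record: PrioritizationRecord,
                 path: str = "prioritization_log.jsonl") -> None:
    """Append one decision to a JSON-lines log so the methodology can be reviewed later."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")


log_decision(PrioritizationRecord(
    task="Churn prediction model",
    decision="prioritized",
    evidence="High projected revenue impact; core data pipeline already exists.",
    second_order_risks="Retention team may need new tooling to act on the scores.",
    decided_on=str(date.today()),
))
```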
Related Skills
- ROI for Data Work
- MLOps Platforms
- Structuring Data Teams
- Effective Dashboards
- SQL
- Machine Learning
- Data Science Career Ladders
- Data Engineering
- Neural Networks
- Analytics
- Analysis Documentation
- Data Infrastructure
- Cohort Analysis
- Data Tools
- ETL
- Data Soft Skills
- Data Dictionary
- Data Governance
- Data Roadmaps
- Event Data
- Personalization
- OCR
- Data Warehouse
- Deep Learning
- Sampling Algorithms
- Data Intuition
- Linear Regressions
- Data Cleaning