Activity
[Contribution heatmap, Jan–Dec]

Memberships

Data Alchemy

Public • 23.7k • Free

10 contributions to Data Alchemy
5 Lessons Learned in 2023 from AI Projects
I've been working with Large Language Models (LLMs) for a year now, building apps for businesses ranging from SMEs to enterprises. Here are 5 lessons I learned in 2023:
1. Quality data is still the key to success.
2. Start with internally-facing applications (lower risk).
3. Transitioning from PoC to production is challenging.
4. Implement LLM Operations (LLMOps) from day one.
5. Be prepared to change your architecture every quarter.
What's your biggest lesson from learning or working with AI this year? P.S. Are we already connected on LinkedIn?
35 likes • 16 comments • New comment Dec '23
5 Lessons Learned in 2023 from AI Projects
4 likes • Dec '23
@Marco Bottaro Great question. I'm facing some of these challenges right now as we speak.
The Job Seeker project and other odds and sods
Hey all, reporting back with news about our project. Ana, Brandon and I have been busy preparing the data pipeline from scraping to analysis:
- Scraping: we search for a specific job title on LinkedIn (e.g., Data Engineer) and scrape X number of search-results pages. We save the resulting data (e.g., title, location, description, etc.) into a JSON file with the help of the pydantic module.
- Skills extraction: we prompt the LLM to extract the required skills from the job descriptions. Again with the help of pydantic, we save the resulting data into a JSON file.
- Skills clustering: the resulting JSON is far too detailed to be useful, so we prompt the LLM to evaluate the skills and identify and name clusters of skills. Pydantic, JSON, file.
- Data processing: we have taken a first shot at processing this data: dimensionality reduction, association rule mining, network analysis.
We are just at the beginning, but it's super inspiring that we got this far in our own project. It's too early to report findings, but once we find something interesting, we'll let you know :)
I recommend everybody find like-minded community members and embark on projects where you can contribute, learn and co-operate. It's motivating to know that you are on a joint endeavour. Speaking of which: thanks to Zachary, our team was able to reach out to an organisation in East Africa that helps farmers turn to organic farming. We had our first meeting yesterday and learnt so much about their objectives and the problems they face. We are brainstorming on what project we can design so we learn, collaborate and contribute. Ana is extremely talented at bringing people together and materialising opportunities. It's thanks to her enthusiasm that we got this far with the organic farming project!
Over and out, back to enjoying ourselves on our self-initiated projects :)
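To make the scrape-and-save step a bit more concrete, here is a minimal sketch of how the pydantic part could look. The JobPosting and SkillsReport models, their field names and the output path are illustrative assumptions rather than the project's actual code, and the scraping and LLM prompting steps are left out.

```python
# Minimal sketch (assumes pydantic v2). Models, fields and paths are hypothetical;
# the LinkedIn scraping and the LLM prompting are stubbed out.
import json
from typing import List
from pydantic import BaseModel


class JobPosting(BaseModel):
    """One scraped search result (hypothetical fields)."""
    title: str
    location: str
    description: str


class SkillsReport(BaseModel):
    """Skills an LLM would extract from a single job description."""
    job_title: str
    skills: List[str]


def save_postings(postings: List[JobPosting], path: str = "postings.json") -> None:
    """Validate with pydantic, then persist the batch as JSON."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump([p.model_dump() for p in postings], f, indent=2)


if __name__ == "__main__":
    demo = [JobPosting(title="Data Engineer", location="Berlin",
                       description="Build and maintain data pipelines...")]
    save_postings(demo)
```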
8 likes • 7 comments • New comment Dec '23
The Job Seeker project and other odds and sods
4 likes • Dec '23
This is such amazing teamwork.
How do you feel?
Dave is a great instructor, but 95% of the group is at level 1 or 2, so they can't get access to the locked content. I would like to know how you feel about it. It seems pretty difficult to me to get 20 likes. :)
21 likes • 23 comments • New comment Dec '23
5 likes • Dec '23
@Eric Castel and I just went through every one of your posts and liked them, and got you 20+ likes. Happy? :-)
3 likes • Dec '23
@Olu Akin I think this is such a good reply, Olu. Thank you for saying this.
Anaconda Environments Creation Strategy
A question for experienced Anaconda users. Q: Do you create an Anaconda environment for each individual project that you are working on, or do you have some other strategy for creating environments?
7 likes • 5 comments • New comment Dec '23
5 likes • Dec '23
Yes, the purpose of an environment is to encapsulate your project and its dependencies. If you are building a scraper, you might use Beautiful Soup, Selenium, etc.; you won't need those libraries if you are doing linear regression analysis while accessing your data in a spreadsheet or a database. So you pick the project, set up the environment, and then install only the libraries you need. Also, when you create your requirements.txt file, you want to capture only the libraries relevant to that project. There is no point cluttering the environment with unnecessary libraries and then passing them on in requirements.txt to others, who would install them and end up with a cluttered environment too. One last thing: when you push your repo to GitHub, make sure you are not pushing your environment. Whoever clones your repo can easily set up their interpreter with a fresh environment and just install your requirements.txt.
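The thread is about conda, but as a rough sketch of the same one-environment-per-project idea, here is what it could look like with Python's built-in venv module. The helper name and the .venv layout are assumptions; with conda you would reach for conda create and conda env export instead.

```python
# Hypothetical helper: one isolated environment per project, installing only
# that project's requirements.txt (venv stand-in for the conda workflow).
import subprocess
import venv
from pathlib import Path


def create_project_env(project_dir: str) -> None:
    """Create a per-project environment and install only its declared deps."""
    project = Path(project_dir)
    env_dir = project / ".venv"          # keep it inside the project, add to .gitignore
    venv.create(env_dir, with_pip=True)  # fresh interpreter + pip, nothing else

    requirements = project / "requirements.txt"
    if requirements.exists():
        pip = env_dir / "bin" / "pip"    # on Windows: env_dir / "Scripts" / "pip.exe"
        subprocess.run([str(pip), "install", "-r", str(requirements)], check=True)


if __name__ == "__main__":
    create_project_env("my-scraper-project")
```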
4 likes • Dec '23
@Shanice Williams That's a good point, Shanice. I have one as well which is just a playground. However, I sometimes get library conflicts, especially with NumPy when it is not compatible with other libraries' versions and such.
For All the New People - LLMs
I'm new here and I see there are a lot of others as well. Some of you have mentioned an interest in LLMs. Here is a quick 5-minute clip that explains them really clearly and simply; I hope it will be useful to you. https://www.youtube.com/watch?v=5sLYAQS9sWQ
5 likes • 5 comments • New comment Dec '23
For All the New People - LLMs
1 like • Dec '23
@Dejan M I was looking into getting one of these glass boards. These things are insanely expensive.
2 likes • Dec '23
@Dejan M - Yes please. :-)
Cody Algorithm
Level 3 • 19 points to level up
@cody-algorithm-9877
I am a Python developer and Data Scientist. I play with pandas... I cook beautiful soup... I code with the rhythm of algorithm... Come check my alchemy.

Active 359d ago
Joined Dec 9, 2023