blablanonymous
blablanonymous t1_j1fdeb6 wrote
Reply to comment by LimitedConsequence in [P] Regression Model With Added Constraint by rapp17
Ha good point
blablanonymous t1_j1f4wy6 wrote
Reply to comment by LimitedConsequence in [P] Regression Model With Added Constraint by rapp17
You’re suggesting softmax, then normalizing to whatever total is required. Softmax takes the exponential of each value and then normalizes. You might not want that exponential. You can instead build a layer that does the plain normalization, via a custom activation function used in the last layer
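A minimal NumPy sketch of the difference (the function names and the non-negativity assumption are mine, not from the thread):

```python
import numpy as np

def softmax(x):
    # exponentiate then normalize: the exp amplifies large values
    e = np.exp(x - x.max())
    return e / e.sum()

def normalize(x, total=1.0):
    # plain normalization, no exponential: preserves relative magnitudes
    # assumes non-negative inputs (e.g. after a ReLU/softplus last layer)
    return total * x / x.sum()

x = np.array([1.0, 2.0, 3.0])
# softmax(x) skews mass toward 3.0; normalize(x) keeps the 1:2:3 ratios
```

Either one can be wrapped as the last-layer activation; the point is that softmax changes the relative weights while plain normalization doesn’t.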
blablanonymous t1_j1dp14o wrote
Reply to comment by LimitedConsequence in [P] Regression Model With Added Constraint by rapp17
If you just want to normalize everything, why create a custom activation function that just does that?
blablanonymous t1_j1aacs3 wrote
Reply to comment by rapp17 in [P] Regression Model With Added Constraint by rapp17
It sounds more like constrained optimization than ML but still too vague of an explanation for me to be helpful so I’m giving up. Good luck
blablanonymous t1_j1a5ovy wrote
Reply to comment by Sir-Rhino in [P] Regression Model With Added Constraint by rapp17
Maybe give an example? Maybe I’m just slow but I think it’s ambiguous the way it’s phrased
blablanonymous t1_j0dsd7h wrote
Yeah just use logistic regression
blablanonymous t1_izx7gp9 wrote
Reply to comment by [deleted] in [D] - Has Open AI said what ChatGPT's architecture is? What technique is it using to "remember" previous prompts? by 029187
Decision tree of life
blablanonymous t1_izjz5vb wrote
Reply to comment by cantfindaname2take in [D] Product Recommendation Algorithm by RstarPhoneix
Fair
blablanonymous t1_izj3040 wrote
Reply to comment by cantfindaname2take in [D] Product Recommendation Algorithm by RstarPhoneix
Isn’t the simplest version of what I described essentially matrix factorization?
blablanonymous t1_izixlxm wrote
Reply to comment by fragilistical in [D] Product Recommendation Algorithm by RstarPhoneix
Hmm, neural networks can be arbitrarily small. You can model logistic regression with a neural net. You could definitely try a simple neural network with one embedding layer for users and one for products, and use a sigmoid activation for the output layer. Then you need to combine these layers in some way, for instance with a simple dot product or concatenation. You can start with no hidden layers and add more if the results are not satisfactory.
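A rough NumPy sketch of the dot-product variant’s forward pass (the sizes and names are made up for illustration; in practice you’d learn the embedding tables with a framework like PyTorch or Keras):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical sizes — placeholders, not from the thread
n_users, n_products, dim = 100, 50, 16

# one embedding (lookup table) per user and per product;
# these would be the trainable parameters
user_emb = rng.normal(size=(n_users, dim))
product_emb = rng.normal(size=(n_products, dim))

def predict(user_id, product_id):
    # dot product of the two embeddings, squashed by a sigmoid
    logit = user_emb[user_id] @ product_emb[product_id]
    return 1.0 / (1.0 + np.exp(-logit))
```

With no hidden layers this is essentially learned matrix factorization with a sigmoid on top; concatenation plus hidden layers is the step up if the dot product underfits.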
blablanonymous t1_iz9rb39 wrote
Reply to [D] If you had to pick 10-20 significant papers that summarize the research trajectory of AI from the past 100 years what would they be by versaceblues
That’s a good question for Galactica
blablanonymous t1_iyey3oo wrote
Reply to comment by currentscurrents in [D] Other than data what are the common problems holding back machine learning/artificial intelligence by BadKarma-18
Chill out. Obviously ML is useful. The biggest companies in the world are only this big because of ML. I’m just saying that in my experience, many companies think they need it when they really just need data engineers and data analysts
blablanonymous t1_iychp9l wrote
Reply to comment by Phoneaccount25732 in [D] Other than data what are the common problems holding back machine learning/artificial intelligence by BadKarma-18
Actually having a use case
blablanonymous t1_ivif2da wrote
Reply to comment by ObjectManagerManager in [D] At what tasks are models better than humans given the same amount of data? by billjames1685
We’re also constantly iterating on the learning algorithm. We learn to learn. That’s one of the most important skills we acquire throughout our education. Computers need to be taught how and what to learn, for the most part
blablanonymous t1_iuvr8jt wrote
I’m curious why this is a useful problem to solve. What is an application for this? Also, after looking for a minute, I can’t find more than a few instances of a fiber that goes under another fiber AND comes back up again. I think I would try non-DL/classic segmentation techniques
blablanonymous t1_iue08q3 wrote
Reply to Ozone Hole Continues Shrinking in 2022, NASA and NOAA Scientists Say | Annual Antarctic ozone hole over the South Pole was slightly smaller than last year and generally continued the overall shrinking trend of recent years. by yourSAS
The one piece of good news about the direction this planet is going?
blablanonymous t1_itf1o9u wrote
Reply to comment by sugar_scoot in [P] is it necessary to convert audio data from analog to digital? by SSC_08
I really hope it’s the latter
blablanonymous t1_isytbad wrote
Reply to comment by stevewithaweave in [D] GPT-3 is a DREAM for citation-farmers - Threat Model Tuesday #1 by TiredOldCrow
🤣😂
blablanonymous t1_isystb6 wrote
Reply to comment by stevewithaweave in [D] GPT-3 is a DREAM for citation-farmers - Threat Model Tuesday #1 by TiredOldCrow
But wouldn’t you need a set of papers you’re actually very confident are real?
blablanonymous t1_iswe0za wrote
Reply to comment by gravitas_shortage in [D] GPT-3 is a DREAM for citation-farmers - Threat Model Tuesday #1 by TiredOldCrow
Won’t you need a labeled training set to make that work?
blablanonymous t1_isntav6 wrote
Reply to [D] If implementing any cognitive function was as easy as making an API call, what would you develop? by ImportanceDecent92
I would create a conversational AI that could be used as a 24/7 therapist. It would talk you out of feeling like crap in the short term and help break whatever unhealthy behavioral pattern you’re trapped in, help you build new healthy habits, etc. Not sure what exact cognitive functions are required, more than one though, and I feel like it could be framed as a reinforcement learning problem? You’d need a way to assess someone’s mental state, and then define actions (just saying things to the subject) that would make them feel better but also challenge them enough to improve their state in the long term
blablanonymous t1_j1msyd5 wrote
Reply to [D] The case for deep learning for tabular data by dhruvnigam93
Can’t you also create custom loss functions for XGBoost? I’ve never used it myself but it seems as easy as doing it for an ANN
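For what it’s worth, a custom objective in gradient boosting mostly comes down to supplying the per-example gradient and hessian of your loss. A hedged sketch for plain squared error (the function name is mine; XGBoost’s native `xgb.train` API accepts such a callback via its `obj` argument, wrapped to read labels from the `DMatrix`):

```python
import numpy as np

def squared_error_obj(preds, labels):
    # a custom objective returns the per-example gradient and hessian
    # of the loss w.r.t. the raw prediction; for 0.5 * (pred - label)^2:
    grad = preds - labels          # first derivative
    hess = np.ones_like(preds)     # second derivative is constant 1
    return grad, hess
```

Swapping in a different loss just means deriving its first and second derivatives, which is roughly the same amount of work as writing a custom loss for an ANN.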
Is it always trivial to get meaningful embeddings? Does taking the last hidden layer of an ANN guarantee that the representation will be useful in many different contexts? I think it might need more work than you expect. I’m actually looking for a write-up about what conditions need to be met for a hidden layer to provide meaningful embeddings. I think using a triplet loss intuitively favors that, but I’m not sure in general.
XGBoost allows for this too, doesn’t it? The scikit-learn API at least definitely lets you create MultiOutput models very easily. Granted, it can be silly to have multiple models under the hood, but whatever works.
Sorry, I’m playing devil’s advocate here, but the vibe I’m getting from your post is that you’re excited to finally get to play with DNNs, which I can relate to. But don’t get lost in that intellectual excitement: at the end of the day, people want you to solve a business problem. The faster you can get to a good solution, the better.
In the end it’s all about trade-offs. The people who employ you just want the best value for their money.