Strange learning curves

  • 11 Replies
  • 4490 Views

Kaeldric

  • Guest
Strange learning curves
« on: February 13, 2018, 11:20:24 pm »
Hello.
Recently I developed a linear regression algorithm and wanted to test it by plotting its learning curves.
I took some datasets from UCI (https://archive.ics.uci.edu/ml/datasets.html) and started drawing some curves.
After a while, for some combinations of parameters, the training-set curve and the cross-validation-set curve crossed each other!
After that they seemed to continue on their way: the training curve kept rising and the CV curve kept going down.  :-\

From what I studied, those curves should saturate to two nearby values, or (in high-bias cases) to two distant values ... but never cross each other.  :o

Does anyone know if this is normal behavior?

Thank you.  :)
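For reference, a minimal Octave sketch of the procedure I'm describing (the synthetic data and the normal-equation fit are stand-ins for the UCI data and my actual training routine):

```octave
% Learning-curve sketch: train on the first i examples, then measure MSE on
% those i examples and on the full cross-validation set.
% Synthetic data stands in for the UCI sets (an assumption).
m_total = 200;
x = linspace(0, 10, m_total)';
y = 3 + 2 * x + randn(m_total, 1);              % noisy linear data

idx   = randperm(m_total);                      % shuffle, then 70/30 split
split = round(0.7 * m_total);
Xtr = [ones(split, 1), x(idx(1:split))];               ytr = y(idx(1:split));
Xcv = [ones(m_total - split, 1), x(idx(split+1:end))]; ycv = y(idx(split+1:end));

mse = @(X, y, theta) mean((X * theta - y) .^ 2);

m = size(Xtr, 1);
train_err = nan(m, 1);  cv_err = nan(m, 1);
for i = 2:m
  theta        = pinv(Xtr(1:i, :)) * ytr(1:i);  % least-squares fit on i examples
  train_err(i) = mse(Xtr(1:i, :), ytr(1:i), theta);
  cv_err(i)    = mse(Xcv, ycv, theta);          % always the full CV set
end
plot(2:m, train_err(2:m), "b", 2:m, cv_err(2:m), "r");
legend("training error", "cross-validation error");
xlabel("number of training examples");  ylabel("MSE");
```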


keghn

  • Trusty Member
  • Terminator
  • 824
Re: Strange learning curves
« Reply #1 on: February 14, 2018, 01:40:54 am »

Are you talking about something like a neural network fitting a noisy standing sine wave?

https://www.youtube.com/watch?v=OPiwyYRpKtA

ranch vermin

  • Not much time left.
  • Terminator
  • 947
  • Its nearly time!
Re: Strange learning curves
« Reply #2 on: February 14, 2018, 02:22:22 am »
All I know about this, that's sort of like it, is to take reading x as the address and plot reading y, and you've got a transfer array, the basic building block of computer memory systems.

I like it, but I haven't ever "evolved" it or anything; I just make direct assignments in my work.
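If it helps, a tiny Octave sketch of that transfer-array idea (made-up readings, arbitrary table size):

```octave
% Transfer-array sketch: quantise reading x into an address and store the
% (average) reading y there, giving a direct x -> y lookup table.
x = rand(100, 1) * 10;                 % made-up input readings in [0, 10]
y = 2 * x + randn(100, 1);             % made-up output readings
nbins = 32;                            % table size (arbitrary choice)
addr  = min(nbins, 1 + floor(x / 10 * nbins));     % map each x to 1..nbins
lut   = accumarray(addr, y, [nbins 1], @mean, NaN);  % mean y per address
% Lookup for a new reading x0: lut(min(nbins, 1 + floor(x0 / 10 * nbins)))
```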


[EDIT]   Your semantics might not be correlating at all; that would cause that behaviour.

You at least need some kind of proportional relationship, like texture density and z distance, or it won't work at all.


Thanks keghn for that video; I wasn't quite sure what I was reading...
[EDIT]

« Last Edit: February 14, 2018, 04:52:42 am by ranch vermin »

keghn

  • Trusty Member
  • Terminator
  • 824
Re: Strange learning curves
« Reply #3 on: February 14, 2018, 03:17:14 am »
 My notes, Cross Validation:
https://www.youtube.com/watch?v=sFO2ff-gTh0


Kaeldric

  • Guest
Re: Strange learning curves
« Reply #4 on: February 14, 2018, 11:53:53 am »
Quote from: keghn
My notes, Cross Validation:

Thank you, it was very instructive. It is, by the way, almost the same procedure I used.
The only difference is that I didn't use folds. I extracted a training set and a cross-validation set from the original data set and plotted the MSE of the two sets as a function of the number of training examples used in the learning step.
The only thing that seemed strange to me is that beyond a certain number of training examples the algorithm produces a better hypothesis on the cross-validation set than on the training set.
I wonder whether this is possible behavior or whether there is something wrong in the procedure I implemented.
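For comparison, a minimal sketch of the fold-based version from the video, in Octave (the data and the number of folds are just placeholders, not what I actually ran):

```octave
% k-fold cross-validation sketch: every example gets used for validation
% exactly once. Synthetic data and k = 5 are placeholder assumptions.
n = 100;
x = linspace(0, 10, n)';
y = 3 + 2 * x + randn(n, 1);
X = [ones(n, 1), x];

k = 5;
folds  = mod(randperm(n), k) + 1;         % assign each example to a fold 1..k
cv_mse = zeros(k, 1);
for f = 1:k
  test  = (folds == f);
  train = ~test;
  theta = pinv(X(train, :)) * y(train);   % fit on the other k-1 folds
  cv_mse(f) = mean((X(test, :) * theta - y(test)) .^ 2);
end
printf("mean CV MSE over %d folds: %g\n", k, mean(cv_mse));
```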
« Last Edit: February 14, 2018, 06:06:02 pm by Kaeldric »


Kaeldric

  • Guest
Re: Strange learning curves
« Reply #5 on: February 15, 2018, 12:46:21 pm »
I prepared a different set of examples and plotted "better" curves this time.
Is it possible that I found a couple of data sets that I wasn't able to model with a simple polynomial regression?

By the way, this time I noticed that increasing the degree of the polynomial increases the MSE on the training set. I expected to see it fall, because the polynomial should be overfitting the data.

Do you think there is something wrong, or am I too confident in the theory?
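For what it's worth, with an exact least-squares fit the training MSE should never go up as the degree increases; a quick Octave check on synthetic data (nothing here is my actual data set):

```octave
% Quick check: with an exact least-squares fit, the training MSE is
% non-increasing in the polynomial degree. Synthetic data stands in here.
x = linspace(-1, 1, 50)';
y = sin(3 * x) + 0.1 * randn(50, 1);
for d = 1:8
  p   = polyfit(x, y, d);                      % degree-d least-squares fit
  err = mean((polyval(p, x) - y) .^ 2);        % training-set MSE
  printf("degree %d: training MSE = %g\n", d, err);
end
% If the training MSE rises with the degree, the fit is probably not
% converging (e.g. gradient descent on unscaled polynomial features).
```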

Korrelan

  • Trusty Member
  • Eve
  • 1454
  • Look into my eyes! WOAH!
Re: Strange learning curves
« Reply #6 on: February 15, 2018, 01:26:35 pm »
Hi Kaeldric

Is the linear regression algorithm you are using based closely on an existing methodology, or is it a ‘new’ approach?

You extrapolated your initial training set, then got better results with the same algorithm and a second extrapolated/elaborated set; to me this points to an error in the training data, or to the operational range of the algorithm.

If you are trying to improve on an existing method, I would suggest you first run your own extrapolated training data through an existing, known algorithm, just to check the validity of the data sets.
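Something like this Octave comparison against the built-in polyfit would do (my_gradient_descent_fit is just a stand-in name for your own routine):

```octave
% Sanity check: fit the same data with a known routine (polyfit) and with
% your own code, then compare coefficients. "my_gradient_descent_fit" is a
% hypothetical name for your own implementation.
x = linspace(0, 5, 80)';
y = 1 + 4 * x + 0.5 * randn(80, 1);
p_ref = polyfit(x, y, 1);                 % known fit: p_ref = [slope, intercept]
printf("reference slope = %g, intercept = %g\n", p_ref(1), p_ref(2));
% theta = my_gradient_descent_fit([ones(80, 1), x], y);   % theta = [intercept; slope]
% printf("difference: %g\n", max(abs(theta - [p_ref(2); p_ref(1)])));
```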

 :)
It thunk... therefore it is!...    /    Project Page    /    KorrTecx Website

keghn

  • Trusty Member
  • Terminator
  • 824
Re: Strange learning curves
« Reply #7 on: February 15, 2018, 02:39:29 pm »

Quote from: Kaeldric
After that they seemed to continue on their way: the training curve kept rising and the CV curve kept going down.  :-\

Your regression line should travel in between the high bumps (points) and the low bumps of the original line.

Least squares:   
https://en.wikipedia.org/wiki/Least_squares     

I like to use a sliding-window algorithm and take the average within the window.
Window Sliding Technique: 
https://www.geeksforgeeks.org/window-sliding-technique/ 
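A minimal Octave version of that sliding-window average (the signal and window width are arbitrary):

```octave
% Sliding-window smoothing: replace each point with the mean of a window
% centred on it. The signal and the window width are arbitrary examples.
n = 200;
y = sin(linspace(0, 4 * pi, n))' + 0.3 * randn(n, 1);   % noisy signal
w = 9;                                                  % window width (odd)
smoothed = conv(y, ones(w, 1) / w, "same");             % centred running mean
plot(1:n, y, 1:n, smoothed);
legend("noisy data", "window average");
```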



Kaeldric

  • Guest
Re: Strange learning curves
« Reply #8 on: February 15, 2018, 02:42:39 pm »
Hi Korrelan.
I'm using a simple Octave implementation of linear regression, based on gradient descent to minimize the cost function.
It is a prototype I used to get a machine learning certification recently, and I am quite sure it is doing its job.
But you are right ... maybe I should check the implementation again or try something more "tested".
Can you suggest a known implementation I can use?

Thank you.
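For reference, the kind of gradient-descent setup I mean looks roughly like this (the data, learning rate and iteration count here are arbitrary, not my actual prototype):

```octave
% Minimal gradient-descent linear regression of the kind described above.
% The data, learning rate and iteration count are arbitrary choices.
x = linspace(0, 10, 100)';
y = 3 + 2 * x + randn(100, 1);
X = [ones(100, 1), x];                     % intercept column plus the feature
m = size(X, 1);

theta = zeros(2, 1);
alpha = 0.01;                              % learning rate
for it = 1:5000
  grad  = (1 / m) * X' * (X * theta - y);  % gradient of the squared-error cost
  theta = theta - alpha * grad;
end
disp(theta');                              % should end up close to [3 2]
```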


Kaeldric

  • Guest
Re: Strange learning curves
« Reply #9 on: February 15, 2018, 02:51:13 pm »
Quote from: keghn
Your regression line should travel in between the high bumps (points) and the low bumps of the original line.

I'm very sorry, but I don't understand the meaning, probably because of my broken English. Can you explain, please?

By the way, I think I used the same approach you suggested; in fact my cost function has the form:

C = (1 / (2m)) * sum_{i=1..m} (h(x_i) - y_i)^2

where m is the number of elements in the set,
h(x_i) is the hypothesis for input element x_i, and
y_i is the real value I should get for x_i.
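Vectorised in Octave, that cost is a one-liner; here it is as a small function, with X carrying a column of ones and theta the parameters:

```octave
% Direct translation of the cost above, with h(x) = X * theta.
function C = cost_function(X, y, theta)
  m = length(y);                                   % number of examples
  C = (1 / (2 * m)) * sum((X * theta - y) .^ 2);   % squared-error cost
end
% Example: cost_function([1 0; 1 1; 1 2], [0; 1; 2], [0; 1]) returns 0.
```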

keghn

  • Trusty Member
  • Terminator
  • 824
Re: Strange learning curves
« Reply #10 on: February 15, 2018, 04:41:55 pm »
Are you using the "root mean square" equation? It might be better to use the "least squares" equation.

The in-between line of the high and low points, or in between the high and low points of a wavy line. The fitted line:
https://en.wikipedia.org/wiki/Linear_least_squares_(mathematics)#/media/File:Linear_least_squares_example2.svg 


Linear Regression - Least Squares Criterion Part 1:
https://www.youtube.com/watch?v=0T0z8d0_aY4

 I like smoothing algorithms. 

Savitsky-Golay Smoothing: 
http://www.ipredict.it/methods/SavitskyGolay/ 


Smoothing spline: 
https://www.google.com/search?q=Smoothing+spline&client=ubuntu&source=lnms&tbm=isch&sa=X&ved=0ahUKEwjAxd6MrKjZAhUKqFQKHfBtDAE4ChD8BQgLKAI&biw=1195&bih=614#imgrc=VeRT3ekf2RU3MM:
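A rough sketch of that kind of local-polynomial (Savitzky-Golay-style) smoothing in Octave; window size and degree are arbitrary, and the Octave-Forge signal package also ships a ready-made sgolayfilt():

```octave
% Savitzky-Golay-style smoothing sketch: fit a low-degree polynomial inside
% a sliding window and keep its value at the window centre.
% Window half-width and degree are arbitrary choices.
n = 200;
y = sin(linspace(0, 4 * pi, n))' + 0.3 * randn(n, 1);
half = 7;  deg = 3;
smoothed = y;                              % endpoints are left untouched here
for i = (half + 1):(n - half)
  w = (i - half):(i + half);               % indices inside the window
  p = polyfit((w - i)', y(w), deg);        % local fit, centred on the window
  smoothed(i) = polyval(p, 0);             % value at the window centre
end
plot(1:n, y, 1:n, smoothed);
legend("noisy data", "smoothed");
```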



Kaeldric

  • Guest
Re: Strange learning curves
« Reply #11 on: February 16, 2018, 06:28:14 pm »
Quote from: keghn
Are you using the "root mean square" equation? It might be better to use the "least squares" equation.

Obviously this didn't change the general shape of the curves, but the values give me more intuition about the real error rate.
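For the record, the two measures only differ by a square root, so the curve shape stays the same while RMSE reads in the units of y; a tiny Octave illustration with made-up residuals:

```octave
% RMSE is the square root of MSE: same curve shape (sqrt is monotone), but
% expressed in the units of y. The residuals below are made up.
residuals = [0.5; -1.2; 0.3; 2.0; -0.7];
mse  = mean(residuals .^ 2);
rmse = sqrt(mse);
printf("MSE = %g, RMSE = %g (same units as y)\n", mse, rmse);
```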
Thank you for the suggestion!  :)

 

