Kids

The weird thing about kids is the appropriate compartment to put them in. I like the little people. They’re usually fun, generally nice, and receptive to a kind word. I enjoy being around them.

But we’re not friends. Our interactions are very much shaped by the fact that I’m an adult and they’re not. Part of that is what makes dealing with them pleasant: most are highly receptive to a kind word or a bit of encouragement from an adult. Then they do something infantile, like just scream, and force me to mind the gap.

So these tiny, pleasant little people are usually nice and fun to be around, and I’m never quite sure how to talk to them. Like, some people run into problems with adult subjects, but the adult subjects I slip on aren’t the ones you’d think of.

“Hey, kiddo. How’s your portfolio doing with the market tanking like this? Increasing your bond allocation or holding firm? Take your finger out of your nose.”

“Getting on that benchpress, tiger? Putting up weight? Good, good. Now go wash your hands.”

The solution is to listen, and that’s what I try to do. But listening gives a feeling of intense cognitive dissonance as I think to myself, “I like this little person,” while I’m saying, “Seriously, go wash your hands. Use soap.”

SEiA

I’m trying to find a way to set up Sauron and Bella from Twilight. The problem is, they may be too perfect for each other.

News

Imagine saying this sentence a year ago. Imagine it any time in the last decade.

The market rallied today on the news that with inflation at 9.1%, the Fed might only raise rates by 75 basis points.

Sequence Training for Neural Networks in Matlab

I had 77 pairs of sequences and sequence responses in Matlab. I had two cell arrays, sequences and responses, of dimension 77×1. Each cell held a 10×10,000 array. I created options, layers, and hyperparameters, and executed

[net, info] = trainNetwork(sequences, responses, layers, options);

The network trained. Things worked.

I got more data, hundreds of pairs. I could still train the network, but I was rapidly coming up against the memory limit on my HPC. I wanted to use datastores.

S(cripts) 1
save('sequences.mat','sequences');
save('responses.mat','responses');

S2
AData = fileDatastore('sequences.mat','ReadFcn',@load);
BData = fileDatastore('responses.mat','ReadFcn',@load);
CData = combine(AData, BData);

… %stuff
[net, info] = trainNetwork(CData, layers, options);

Error using trainNetwork (line 184)
Invalid training data. Predictors must be a N-by-1 cell array of sequences, where N is the number of sequences. All sequences
must have the same feature dimension and at least one time step.

Error in S2 (line ##)
[net, info] = trainNetwork(CData, layers, options);

Or in English, it did not work.

Using preview, I got this:

ans =

1×2 cell array

{1×1 struct} {1×1 struct}

First, the load function creates a struct. I needed a de-struct-ing function.
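A de-struct-ing read function can be as small as a one-line wrapper around load. This is just a sketch; the field name sequences is my assumption (load returns a struct whose fields are the saved variables):

```matlab
% load('sequences.mat') returns a struct with one field per saved
% variable; pull the cell array back out before the datastore sees it.
destruct = @(s) s.sequences;   % field name assumed from the save above
AData = fileDatastore('sequences.mat', 'ReadFcn', @(f) destruct(load(f)));
```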

With a de-struct-ing read function in place, I got this.

preview(CData)

ans =

1×2 cell array

{259×1 cell} {259×1 cell}

Second, the combine function creates another cell array, meaning I had a 1×2 cell array (CData), each cell holding a 259×1 cell array (the original sequences and responses cell arrays read from AData and BData), so the actual data was still buried one layer too deep.

The solution to the latter was saving each cell as an individual file with one variable per file, that variable being the 10×10,000 array, NOT A CELL.

S3
%file manip, mkdir, addpath, etc.

for n = 1:length(sequences)
    sequence1 = sequences{n,1};
    response1 = responses{n,1};
    save(strcat('sequences',string(n),'.mat'),'sequence1');
    save(strcat('responses',string(n),'.mat'),'response1');
end

AND THEN running S4

%file manip, preprocessing, etc.

getVarFromStruct = @(strct,varName) strct.(varName);
xds = fileDatastore("sequences*.mat","ReadFcn",@(fname) getVarFromStruct(load(fname),"sequence1"),"FileExtensions",".mat");
yds = fileDatastore("responses*.mat","ReadFcn",@(fname) getVarFromStruct(load(fname),"response1"),"FileExtensions",".mat");

%options, layers, hp, etc.

CData = combine(xds, yds);
[net, info] = trainNetwork(CData, layers, options);

And it worked.

In the preview of CData up there, the combined datastore returned cell arrays OF CELLS. trainNetwork doesn’t want cells of cells; it wants the data. That extra layer of cells caused all those errors.
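A quick sanity check I could have used: after S4, a preview of the combined datastore should hand back raw numeric arrays, not nested cells. The sizes in the comment assume the 10×10,000 arrays from earlier:

```matlab
% Each read from the fixed combined datastore should yield one
% {predictors, responses} pair of plain numeric arrays.
CData = combine(xds, yds);
preview(CData)
% should look roughly like:
%   1×2 cell array
%     {10×10000 double}    {10×10000 double}
```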

That took me weeks, and someone else had to explain it.

The Struggle

It’s a struggle to find positive things to say right now. But that makes the struggle even more important.

My Unicomp buckling-spring keyboard is frankly incredible. It is better than any other keyboard I have ever used. I can’t use it at work, because it is kinda loud. At home, it is mechanically perfect. This is the keyboard of the Clockwork Gods.