

Showing posts from 2017

Rei - Blog Post 4 & 5

Due to external factors, I accidentally missed the last blog post, but I shall make up for it by writing two posts in one! The last couple of weeks have been at least a little productive.

Blog Post 4

Over these weeks, Matthew and I were both working on solidifying and exploring the context portion of our project. Of course, we have been exploring the context since nearly the beginning of the year, but both capstone classes have asked us to write something about the context of our paper. The context of our project can be found throughout our previous blog posts, but a short, summarized version of our context section boils down to a couple of main points: neural networks are not very present in video games, and when neural nets are present, they are used for research or marketing. As for the rest of our Design Document, unfortunately we rushed it a bit. We got caught up in some upsetting events these weeks and allowed our work to suffer instead of staying on task.

Matthew - Capstone Blog Post 5

Soliciting data from the community

It's official. We shilly-shallied about it for months, but now we have finally settled on Rivals of Aether as our training platform. On November 25th, I made a thread on r/RivalsOfAether titled "Looking for replay files to use in machine learning research". I honestly was not sure what kind of response to expect. I had only learned about RoA's existence, I would estimate, sometime around mid-October. Rei pitched it to me several times as a viable alternative to Doom and Quake for machine learning, citing its ability to record input data from matches in plaintext. They even bought me a copy towards the end of October, which featured in my blog post about setting up SerpentAI. The plaintext replay files are certainly an attractive prospect when compared to the binary demo files found in id shooters. Furthermore, the game itself is stylish and fun. I mean, just look at Orcane! Nonetheless, I was wary of the idea of using it as…
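To make the appeal of plaintext concrete, here is a minimal parsing sketch. It is not code from our project, and the actual Rivals of Aether replay layout is not described in this post; the line-oriented format it assumes is invented purely for illustration.

# Hypothetical sketch only: the real RoA replay format is not documented in
# this post. Assume an invented plaintext layout where each non-comment line
# is "<frame> <buttons>", e.g. "120 L,A" meaning frame 120 with left and
# attack held. The point is simply that plaintext like this is trivial to
# turn into (frame, inputs) training pairs, unlike a binary demo file.
from pathlib import Path

def parse_replay(path):
    """Yield (frame_number, buttons_held) pairs from a plaintext replay file."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        frame_str, _, buttons_str = line.partition(" ")
        buttons = set(filter(None, buttons_str.split(",")))
        yield int(frame_str), buttons

# Example usage with a made-up file name:
# for frame, buttons in parse_replay("match_001.txt"):
#     print(frame, buttons)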

Matthew - Capstone Blog Post 4

Finally, our CSI-480 (Advanced Topics: AI) course material is catching up to where we need to be. We are covering perceptrons and sigmoid neurons in the lectures, and we are also using TensorFlow to solve some very simple introductory problems (via tutorials). To supplement this, I have been reading Neural Networks and Deep Learning by Michael Nielsen, a textbook available for free on the internet, which dives into neural networks right from the first chapter. Additionally, I have been finding 3Blue1Brown's multi-part video series about deep learning extremely helpful for visualizing some of the more advanced concepts. Even if I do not fully understand the calculus and linear algebra involved, at the very least I have a better idea of what goes on inside neural networks. For example: I know what loss and gradient descent algorithms do, essentially, and I also understand how the latter helps find a local minimum of the former, but I do not necessarily feel confident…
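To make that last point a little more concrete, here is a small sketch in the spirit of those tutorials, not code from our project: a single sigmoid neuron fit to a toy OR dataset with plain NumPy, where each gradient descent step moves the weights toward a local minimum of the quadratic loss. The dataset, learning rate, and step count are arbitrary choices for illustration.

# Minimal illustration (not from the original post): one sigmoid neuron
# trained by gradient descent on a tiny, made-up dataset, using plain NumPy
# instead of TensorFlow so every moving part is visible.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: learn logical OR. Inputs are 2-bit vectors, targets are 0 or 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
lr = 1.0                 # learning rate (step size)

for step in range(1000):
    a = sigmoid(X @ w + b)          # neuron output for every example
    loss = np.mean((a - y) ** 2)    # quadratic loss: how wrong we are

    # Gradient of the loss with respect to w and b (chain rule through the sigmoid).
    grad_a = 2 * (a - y) / len(y)
    grad_z = grad_a * a * (1 - a)
    grad_w = X.T @ grad_z
    grad_b = grad_z.sum()

    # Gradient descent: step downhill, toward a local minimum of the loss.
    w -= lr * grad_w
    b -= lr * grad_b

print(np.round(sigmoid(X @ w + b), 2))  # outputs move toward [0, 1, 1, 1]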