I am glad to say that I found a solution to the challenges I described in my previous blog entry. This week was quite successful: I finished all of my work ahead of schedule. I completed the application I was working on, added new functionality to it, and optimized it to run much faster on larger data sizes. My biggest obstacle this week was the lack of timely communication with my mentor. On some days I would run into a roadblock that I did not have the means to solve on my own, usually one tied to the code my mentor had provided to me. I understand that some of my work is most likely low priority. However, I do hope that email response times can improve on both sides, since many hours this week were lost to waiting rather than productive work.
I added 3D aperture measurement functionality to my application. Essentially, this allows a user to calculate the 3D aperture (opening) of a rough fracture with the push of a button, on top of the 1D aperture measurement that was already built into the application. I also altered the measurement functions so that the output is written to an Excel file. This alteration opened another can of worms: writing to an Excel file is slow, and the code my mentor provided issued a long series of write-to-Excel commands that each wrote a single 1x1 array of data. I optimized this to first collect all of the data into one large array and then write it to the Excel file in a single call, which sped up execution time considerably.
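The pattern behind that speedup can be sketched in MATLAB. This is only an illustration, not my mentor's actual code: the variable and file names (`apertures`, `results.xlsx`) are made up, and I am using `xlswrite` as the stand-in write command. The cost being avoided is per-call overhead — each `xlswrite` call opens and closes the file (or an Excel COM session) on its own.

```matlab
% Slow: one write call per value, so the file is opened
% and closed once for every single 1x1 array.
for i = 1:numel(apertures)
    xlswrite('results.xlsx', apertures(i), 1, sprintf('A%d', i));
end

% Fast: accumulate everything into one array in memory,
% then write the whole column in a single call.
results = apertures(:);            % one mass array of all the data
xlswrite('results.xlsx', results); % one file open/write/close
```

The same idea applies to any per-element I/O: batch the data in memory first, then hand it to the slow operation once.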
My work due today (Friday) has hit a slowdown, much to my disappointment. Now that I have built all the tools needed to output measurements, I can actually start analyzing data. The problem I encountered is that the data I am trying to process is simply too large for MATLAB and my work computer to handle. Previously, I was working with a sample 64x64x1000 array, and my computer handled operations on it quite easily. For my assignment I need to process a 1024x1024x400 array, a size that my computer cannot handle. I hope that I can run this application on Stampede; otherwise I might need to upgrade my work computer.
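A quick back-of-the-envelope calculation shows why the jump hurts, assuming the arrays are stored as MATLAB's default double precision (8 bytes per element) and ignoring the temporary copies MATLAB makes during processing:

```matlab
% Rough memory footprint of each array in double precision
sample_MB = 64*64*1000*8  / 1e6;   % ~32.8 MB: trivial for a workstation
full_GB   = 1024*1024*400*8 / 1e9; % ~3.36 GB: before any temporaries
```

The full array is roughly 100x larger than the sample, and intermediate results during processing can multiply that further, which is why a cluster like Stampede is the more realistic home for this job.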