
    ENG 06 Spring 2019 Final Project
Up to now in ENG6, we have focused on teaching you the “how” of programming. In the team
project you will take your own ideas and bring them to fruition through your knowledge
of computer programming. Software development is rarely a one-person effort, so the project
will be team-based. Teams can be formed with members of any section. You can form your own
team of three; no other team size will be allowed. Only if strictly needed, the TAs may form
smaller teams or add members to teams. Beyond the core part of the final project, we ask you
to implement a number of special features, called Reach elements, so as to make your project
different from your classmates'.
You will be able to choose among three different project implementations.
All Graphical User Interface (GUI) implementations must be programmed using App
Designer. You are not allowed to implement the project using GUIDE; submissions
programmed with GUIDE will receive zero points, without exception.
Project 1: Electronic Data Acquisition and Processing using MATLAB.
In this project you will use an Arduino Uno microcontroller board to acquire analog signals from a
sensor. You may choose any other microcontroller, but MATLAB commands to control it
must be available. You have the freedom to choose the variable to
be measured; a few possibilities are:
Temperature using a thermistor
Light using a photocell or phototransistor
Acceleration using an accelerometer
Magnetic field using a magnetometer
Humidity using a hygrometer
A sensor measuring more than one of the above variables!
Please keep in mind that since this is not a hardware electronics class, your grade will be based
mostly on the originality of the Graphical User Interface (GUI) implementation, data processing,
and presentation. It will be your responsibility to acquire the microcontroller and associated
electronics (sensors, resistors, etc.). However, today an Arduino (for example) can be acquired
for less than $17, and the sensing electronics cost a few dollars as well; you are not
confined to using an Arduino. What is important in this project is how you process and
display the data using MATLAB.
    Core Requirements: All projects should have a graphical user interface with the following
    capabilities:
1. A frame with a graph displaying the sensed signal in quasi real time. If you wish, you can
display more than one signal at a time.
    2. A button to record a signal over a predetermined time interval.
    3. A button to display the recorded signal either on a separate frame or on the one used for
    real time display.
4. An indicator on the control panel, coupled to an LED on the data acquisition board, that
is ON while the data is being sampled.
Special Features: You can choose the special features you decide to implement; however, a
    few examples are:
    Signal noise removal based on taking the average of a predetermined number of data
    sets followed by the display of the raw and averaged data on the GUI panel.
    Finding and displaying signal statistics such as mean, standard deviation, peak
    amplitude of a signal, etc.
Horizontal and/or vertical cursors controlled by sliders that indicate the x or y coordinate
of the cursor. Even better, more than one cursor to show the x- or y-coordinate
difference between the two. This can be used, for example, to indicate the width of a
pulse signal (FWHM) or 10-90% rise/fall times.
    Signal reversal, slow or fast motion, signal arithmetic (addition, subtraction) between
    multiple signals.
    Your own ideas….!
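The noise-removal idea above can be sketched in a few lines; this is a minimal simulation in which the signal shape, noise level, and number of data sets (N = 8) are invented purely for illustration:

```matlab
% Simulate N noisy acquisitions of the same signal, then average them.
N = 8; t = linspace(0, 1, 200);
runs = sin(2*pi*5*t) + 0.3*randn(N, numel(t));  % N noisy copies of a signal
avg = mean(runs, 1);                            % averaging reduces the noise
plot(t, runs(1,:), t, avg);                     % raw trace vs averaged trace
legend('raw', sprintf('average of %d runs', N));
```

The same arrays feed directly into the statistics feature: `mean`, `std`, and `max` of a recorded signal are one call each.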
Appendix: Useful information in case you have selected an Arduino.
In order to send analog and digital data to, and receive it from, the Arduino, it is necessary to
install the MATLAB support package for Arduino.
    Instructions for the installation can be found on:
    https://www.mathworks.com/help/supportpkg/arduinoio/ug/intro.htm
    You can check if the support package was installed correctly if you enter:
    >>help writeDigitalPin
    --- help for arduino/writeDigitalPin ---
    Write digital pin value to Arduino hardware.
    Syntax:
    writeDigitalPin(a,pin,value)
    Description:
    Writes specified value to the specified pin on the Arduino hardware.
    Example:
    a = arduino();
    writeDigitalPin(a,'D13',1);
    Input Arguments:
    a - Arduino hardware
    pin - Digital pin number on the Arduino hardware (character vector or string)
    value - Digital value (0, 1) or (true, false) to write to the specified pin (double).
    See also readDigitalPin, writePWMVoltage, writePWMDutyCycle
    A few examples on basic connections to the Arduino can be found on:
https://www.mathworks.com/help/supportpkg/arduinoio/ug/getting-started-with-matlab-support-package-for-arduino-hardware.html
    Instructions to find the Arduino port can be found on:
https://www.mathworks.com/help/supportpkg/arduinoio/ug/find-arduino-port-on-windows-mac-and-linux.html
The following link demonstrates a straightforward way to improve the accuracy of Arduino
signals measured as a function of time:
https://www.mathworks.com/videos/log-temperature-data-from-arduino-into-matlab-1489428648919.html
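With the support package installed, the quasi-real-time display requirement from the Core might be sketched as follows. The pin name 'A0', the 10 s recording interval, and the wiring of a sensor to that pin are assumptions for illustration:

```matlab
% Stream an analog sensor voltage to a live plot for a fixed interval.
a = arduino();                     % connect to the board (auto-detects port)
h = animatedline;                  % animatedline is efficient for streaming
xlabel('Time (s)'); ylabel('Voltage (V)');
t0 = tic;
while toc(t0) < 10                 % record over a predetermined 10 s window
    v = readVoltage(a, 'A0');      % sample the sensor on analog pin A0
    addpoints(h, toc(t0), v);
    drawnow limitrate;             % refresh the plot without blocking
end
```

In the actual project this loop would live inside an App Designer callback, with the recorded (t, v) pairs stored for the playback button.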
    Project 2: Steganography
    Steganography is the practice of concealing a message, image, or file within another
    message, image, or file. The advantage of steganography over cryptography alone is that
    the intended secret message does not attract attention to itself. The general idea is to
    store the hidden image in the red, green, and blue channels of the RGB image without
    changing the pixel color by a perceivable amount. For more information check here:
http://en.wikipedia.org/wiki/Steganography. In this project you will embed at least 10
secret images simultaneously into an inconspicuous image (the dog image). All the
images are in the file steganography.zip.
Before starting on the project, make sure to read the entire document, as Parts 1-4 make
references to a GUI that is detailed in Part 5. It is up to you whether you create the GUI first or
last.
    Part 1: Flattening the Images
1. There is an image dog.png, in which you will conceal at least 10 hidden images.
Try viewing it.
    2. There are ten images labeled hiddenXX.png with XX from 1 to 10. Let’s take the
    grayscale hidden image and make it black and white:
a. Load one of the hidden images. It should be a 400x400 matrix of integers
(grayscale).
    b. Create a function to “Flatten” the image by creating a 400x400 matrix with values
    “1” if the corresponding hidden image pixel is dark and “0” if the pixel is light
    (black&white). Your threshold for dark/light is up to you as long as the image is
    still distinguishable after converting from grayscale to black&white.
    c. Create a function to “Expand” the flattened image by taking the 400x400
    flattened image and creating a new RGB image with black pixels for “1”
    elements, and white pixels for “0” elements.
d. Test by Flattening and Expanding one of the images and making sure it can be
displayed properly.
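The Flatten/Expand pair described above might look like the following sketch; the threshold value (128) and the function names are our assumptions, not requirements:

```matlab
function bw = flattenImage(gray)
    % gray: 400x400 grayscale matrix; bw: 1 where dark, 0 where light.
    % 128 is an assumed threshold; tune it so the image stays legible.
    bw = double(gray < 128);
end

function rgb = expandImage(bw)
    % bw: 400x400 matrix of 0/1; rgb: RGB image, black for 1, white for 0.
    channel = uint8(255 * (1 - bw));   % 1 -> 0 (black), 0 -> 255 (white)
    rgb = cat(3, channel, channel, channel);
end
```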
Part 2: Create Embedding Techniques
One example of a technique that can be used is the odd/even embedding technique. It is one
of the different techniques that you will use to hide all the black and white images in the
concealing image.
The flattened hidden image is a 400x400x1 matrix of values 0 (white) and 1 (black) pixels. The
normal image is a 400x400x3 matrix of values 0-255, representing the values of the red, green,
and blue channels. Our first embedding technique goes as follows:
1. Use your “Flattening” function from part 1 to flatten the hidden image selected by the
    user to a flattened hidden image.
    2. If flattened hidden image pixel (x,y) is black, make the red channel of pixel (x,y) in the
    normal image odd valued by either subtracting zero or one.
    3. If flattened hidden image pixel (x,y) is white, make pixel (x,y) on the red channel of the
    normal image even valued by either subtracting zero or one.
    In your GUI call this method the “Odd/Even Red Embedding”.
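A minimal sketch of the three steps above, assuming `bw` is the 400x400 flattened (0/1) hidden image and `img` the 400x400x3 uint8 normal image (the variable names are ours, not prescribed):

```matlab
% Odd/Even Red Embedding: black (1) -> odd red value, white (0) -> even.
red = double(img(:,:,1));
red(red == 0) = 2;              % avoid underflow: a red value of 0 stays even as 2
red = red - mod(red - bw, 2);   % subtract zero or one to match the target parity
img(:,:,1) = uint8(red);
```

The `red == 0` adjustment is needed because "subtracting one" from a red value of 0 would underflow when the pixel must become odd.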
    Part 3: Create Recovery Technique
    Note that for every embedding technique created in part 2, an appropriate recovery technique
    has to be created. As an example, we show you the recovery technique for the odd/even
    embedding technique. To recover the hidden image back from our embedded image we can
    perform the following:
    1. Create a blank matrix of size 400x400. Let’s call it our recovered image.
    2. If the embedded image red channel pixel (x,y) is odd valued, set recovered image pixel
    (x,y) to 1. Otherwise set it to 0.
3. Use your “Expand” function from part 1 to convert the 400x400x1 recovered image into
an RGB image so it can be displayed in your GUI.
    In your GUI call this recovery method “Odd/Even Red Recovery”.
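The matching recovery can be sketched as follows; `embedded` is the image produced by the embedding step, and `expandImage` stands in for the “Expand” helper Part 1 asks you to write (the name is our assumption):

```matlab
% Odd/Even Red Recovery: read the parity of the red channel back out.
red = double(embedded(:,:,1));
recovered = double(mod(red, 2));   % odd -> 1 (black), even -> 0 (white)
rgb = expandImage(recovered);      % convert to RGB for display in the GUI
```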
Part 4: Five Additional Embedding Techniques
Here is where your creativity comes in. We have gone through one method of embedding a
hidden black and white image within an RGB image. You have a total of 10 images you need to
embed any way you would like. Here are some suggestions:
Remember you have a red/green/blue channel, so you can embed an image per
channel per embedding technique (i.e., the odd/even technique can save 3 images in the
red, green, and blue channels respectively).
You will find the mod function important. Think about how you can use the modulus of
the pixel values in a similar fashion to how we used the odd/even-ness of the values.
    The idea is that whatever operation you apply to the pixels of the normal image, there is
    a way of identifying what was done. In the case of the odd/even method we can identify
    if a pixel is odd or even. We could have also done things like set the last digit of every
    pixel to be 3.
    Remember your methods need to work together nicely. If you apply method X then
    method Y, method Y should not interfere with the recovery using method X!
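As one hypothetical additional technique in the spirit of the mod suggestion: force each green-channel value modulo 3 to equal the hidden bit. The channel choice and the modulus are our assumptions, not part of the spec:

```matlab
% Mod-3 embedding on the green channel: make mod(green, 3) equal bw (0 or 1).
green = double(img(:,:,2));
green = green - mod(green - bw, 3);          % nearest value with mod 3 == bw
green(green < 0) = green(green < 0) + 3;     % avoid underflow below zero
img(:,:,2) = uint8(green);
% Recovery: recoveredBit = mod(double(img(:,:,2)), 3);
```

Note this variant does not disturb parity of the red channel, so it composes cleanly with the odd/even red method, as the compatibility rule above requires.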
    Part 5: Create a User Interface
    Create a GUI which meets the following criteria:
    1. Allows the user to load a normal image (you are given an image of a dog but any
    400x400 pixel png image should work). When the user chooses this option, the image
    should appear.
    2. Allow the user to pick one or more of the ten hidden images to embed in the normal image.
    3. Create buttons/list to allow the user to select which of the hiding techniques to apply to
    the images needed to be hidden.
    4. Create a button to save the resultant image after embedding the hidden message(s).
    5. The GUI should allow loading an embedded image and extraction of hidden images
    using each of the methods.
6. The more extra features your group adds to the GUI, the more points you will get for
creativity. Below is an example of a GUI layout you could make for the project.
Project 3: Audio Sampler
In this project, your group will be required to program a basic audio sampler Graphical User
Interface. It will model some of today’s commercial hardware-based audio samplers from
companies like Akai, Native Instruments, Roland, etc.
    As described in Mathworks Documentation:
    “The audio signal in a file represents a series of samples that capture the amplitude of
    the sound over time. The sample rate is the number of discrete samples taken per
    second and given in hertz. The precision of the samples, measured by the bit depth
(number of bits per sample), depends on the available audio hardware. MATLAB® audio
    functions read and store single-channel (mono) audio data in an m-by-1 column vector,
    and stereo data in an m-by-2 matrix. In either case, m is the number of samples. For
    stereo data, the first column contains the left channel, and the second column contains
    the right channel.”
Given this information, we can manipulate audio easily, since it is contained in arrays. We can
choose the sample size and sample rate, and manually chop up the audio section by section
according to user-defined input. This forms the basis of our audio sampler.
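As a concrete starting point, loading a sample and playing it back takes only a few calls. The file name 'kick.wav' is an assumption for illustration:

```matlab
% Load a .wav sample, inspect its array shape, and play it back.
[y, fs] = audioread('kick.wav');   % y: m-by-1 (mono) or m-by-2 (stereo)
nSamples = size(y, 1);
durationSec = nSamples / fs;       % number of samples divided by sample rate
sound(y, fs);                      % playback at the file's sample rate
```

Wiring this into a grid-button callback in App Designer gives the pad behavior described below.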
    Here are examples of commercial audio samplers (hardware and software):
    SONiVOX Sampla Software Based Sampler
    Native Instruments Maschine Hardware/Software Sampler
    Akai MPD32 Hardware Based Sampler
    Keep in mind these examples and others can be used as a reference for what features you
    would like to add to your sampler in order to be more creative. You will not be expected to
    produce a commercially ready product, however, you will be expected to implement some of the
    core functionality that these types of samplers include.
    Functionality Requirements
    1.) Basic Audio File Input
    ● The user will be able to load various samples, or .wav files into the program for
    playback. The user must be able to load the samples into the program and
    associate the samples with buttons that, when pressed, provide audio playback.
    ● A minimum of 9 buttons (3 x 3 grid) must be implemented. If your group is feeling
    more ambitious, there is no limit to the grid sizing you can use (4 x 4, 5 x 5, 6 x 6
    etc.)
    ● The loaded samples will play back when the individual grid sample buttons are
    pushed, NOT a single play button for all the samples. The point is to be able to
    play each loaded sample by its button in real time, very much like an electronic
drum or instrument rack/pads.
2.) Effects and Sample Modification
    ● The user should have the option of editing the individual samples that have been
    loaded through use of an interactive menu.
    ● The user should be able to add audio effects to each sample. These audio
    effects can be implemented by altering the sound data arrays. For example, in
    order to reverse the sample, one would simply flip the array backwards.
    Required Effects Include:
    a.) Sample reversal
    b.) Delay
    c.) Tone Control (Filtering)
    d.) Speed up
    e.) Voice removal
    Here is a reference to help you get started on basic audio file manipulation:
    http://homepages.udayton.edu/~hardierc/ece203/sound.htm
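Two of the required effects can be sketched directly as array operations; the delay time, mix level, and file name are assumptions:

```matlab
% (a) Sample reversal and (b) delay, on a sample array y at rate fs.
[y, fs] = audioread('sample.wav');
reversed = flipud(y);                      % reversal: flip the array rows
delaySec = 0.25; mix = 0.5;                % assumed delay settings
pad = zeros(round(delaySec * fs), size(y, 2));
delayed = [y; pad] + mix * [pad; y];       % add a time-shifted, attenuated copy
delayed = delayed / max(abs(delayed(:)));  % normalize to avoid clipping
```

`flipud` works for both the m-by-1 mono and m-by-2 stereo layouts described in the quoted documentation, since samples run down the rows.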
3.) Chopping
    ● The user should have the ability to choose a loaded sample, and chop/edit the
    length of the sample.
    ● They should be able to pick the new start and stop times of the editable sample.
    This can be accomplished by editing the size of the array.
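Chopping by start/stop time reduces to index arithmetic on the array; the times below stand in for user input:

```matlab
% Chop a loaded sample to a user-chosen time window.
[y, fs] = audioread('sample.wav');
tStart = 0.5; tStop = 2.0;                     % assumed user-chosen times (s)
iStart = max(1, round(tStart * fs));           % convert seconds to sample index
iStop  = min(size(y, 1), round(tStop * fs));   % clamp to the sample length
chopped = y(iStart:iStop, :);                  % keep all channels
sound(chopped, fs);                            % audition the chopped sample
```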
    4.) Basic Tone Generation (Synthesizer)
    ● User should be able to load into a sample box a pure tone generated through a
    mathematical function.
    ● Creativity points will be assigned to students that can generate more than just a
    basic sine wave, like a square wave, triangle wave, amplitude modulation, etc.
    ● For the basic sine tone, the user must be able to select the frequency or pitch of
    the tone. This can be done using an interactive keyboard or switchboard/drop
down menu.
Project Deadlines:
    Deadline #1: Wednesday, May 15, 5:00 pm: A team member must submit your team names to
    a Google Docs Form. We will post a Canvas announcement once the form is ready for
    submissions. Only one team member should do this!
    Deadline #2: Saturday, May 18, 5:00 pm: Submit proposal.
Deadline #3: Friday, June 7, 5:00 pm: Each team will submit all relevant coding files, a link to a
YouTube video, and team evaluation materials. Each team will submit a zip file of all the code;
the zip file will have all .m files, all .img files, and any other files that are needed for the project to
run. The zip file should also contain a .pdf file of the code, as well as a PDF of the team
evaluation document. The YouTube link should be accessible to all those who use the link.
    Collaboration Policy: Once teams are formed you are only allowed to talk and collaborate
    with members within your team. Team members are expected to equally participate, and
    collaboratively work towards the completion of the project. Other than contacting the teaching
    assistants for clarification, you may not seek the assistance of other persons to complete your
    team's project. (Of course, general discussions about how to implement GUI, OOP and other
    programming constructs could be discussed with other students, but your team must be totally
    responsible for the implementation of code used in your project).
    Grading Criteria: The projects are open ended. As long as your program can perform the
    assigned tasks, there will be no correct or incorrect approaches. Certainly there will be more
    acceptable and attractive solutions, and that will be judged in comparison with competing
    solutions submitted by your classmates. The final project will be graded in five parts:
1. Project proposal: Each team submits via Canvas a 2-3 page project proposal
describing the project they have selected, a general description of how you will implement
the main components of your project, and a clear description of the Reach features that your
team proposes. Essentially, the scope of the project should be challenging enough to merit
full credit and doable within the timeline. An Appendix should contain a breakdown of
programming tasks and who will be responsible for what, along with a timeline that will
meet the submission deadline (we suggest you make use of a Gantt
chart). The expectation is that each team member must take responsibility for a
    specific aspect of the project and grading for each member will be adjusted according to
    how the project tasks were delegated and who was responsible for what aspects of the
    project. The more specific you can be in defining the programming tasks, what functions
should exist, and what each function should accomplish, the better. For the data acquisition
project using Arduino, it is not allowed to have a team member working solely on the
electronic implementation; i.e., all team members must have MATLAB coding
responsibilities.
    2. Core: Complete the basic project as outlined in the project specification.
    3. Special Features: A significant component of the project grade will depend on the
    special features your group decides to implement. The main goal of this portion of the
    project is for you to show your data processing skills. Implement the project extensions
    described in your proposal. Your completion of the Core and the difficulty of your proposal
will be taken into account during the grading process.
4. YouTube Video Requirements: YouTube has several examples of ENG6 videos (search
    ENG6). The format of the video is entirely up to your team as long as the following criteria
    are met:
    a. Maximum length of the video is 10 minutes
    b. Each team member must be seen in the video to present their work and contributions
    c. A clear and easy to follow demonstration that shows the correct functionality of your
    program (show your program actually working in the video – not screen shots of before
    and after.)
    d. In your YouTube video, please point out how you implemented some features
    (especially in the Core and the Special Features) inside your code. What functions
    did you use? Did you use any data structures such as structs etc.? What was
    challenging about implementing a certain feature and why?
e. Use visual aids to help explain your steps (whiteboard, markers, poster, etc.).
    The video does not have to be fancy, just effective in relaying the most important
    information.
5. Team Evaluations: Each member must provide a brief personal summary of her/his
involvement and contributions. Each team member is required to submit evaluations of your
own and your teammates’ contributions, one for each of Core and Reach. For example, if your
team has members A, B, and C, your evaluation can be similar to the following for a single
member:
    Team Member A: was in charge of writing the code to execute the equalizer filters. For the
    Reach, A was in charge of adding 2 different analysis plots that could show power spectral
    density plot and frequency content of audio file. Team Members B, C agree that A performed
    these tasks for the project.