The Positive Rail

Spark Core toolchain for Mac OS X

1/31/2015


 
I've begun working with a great product - the Spark Core from Spark.io, which is a development board/module that integrates the TI CC3000 wifi module and an STM32F103 controller.  Spark.io provides a complete open source firmware to drive the system.  

They are continuing to bring out new versions of their product - like the Photon (due in March), which uses a Broadcom chipset that can act as a software access point - important if you want to serve applications from your module without it being connected to a local WiFi network and the Internet.  By developing against the Spark Core I should be able to transfer to this more advanced hardware as it becomes available.

Their system is great for getting started with wireless modular computing.  The Spark Core comes preloaded with firmware that allows it to receive downloads wirelessly, and they provide a complete Web IDE for programming.  

The problem I need to solve is that I want faster download and debugging.  The current download/reset cycle via WiFi is fairly slow; there is no debugger.  There is a tutorial in Spark.io support for setting up a local development and debugging system on Windows.  But I am on a Mac.  How do I get my toolchain to work with the Spark Core?
I'm using Yagarto (Yet Another GNU ARM Toolchain) and the Segger J-Link debug probe (available on Adafruit).
Here are the steps:

1) Install the arm-none-eabi-gcc compiler and arm-none-eabi-gdb debugger tools.  
  • Download the Yagarto 'tarball for Mac' (located on the right side of the page) from launchpad.net
  • Expand the tarball and move the folder to a directory where you keep your development tools (e.g. ~/devtools)
  • Add the /bin directory inside the Yagarto tools folder (where arm-none-eabi-gcc and arm-none-eabi-gdb, among other tools, are kept) to your PATH.  To do this, add the following line to the .profile file in your home directory:
 export PATH="$PATH:$HOME/devtools/gcc-arm-none-eabi-4_9-2014q4/bin" 
  • Check that this works by typing 'arm-none-eabi-gcc' in your home directory (or any other directory besides the /bin directory) - you should get a message 
 arm-none-eabi-gcc: fatal error: no input files
 compilation terminated.
  • This indicates that the arm-none-eabi-gcc executable was found and ran, but was not given any input files to compile.  You should also be able to start the debugger by typing 'arm-none-eabi-gdb'.
  • If things are still not working, type 'echo $PATH' to see the actual path that the shell is searching; you should get something that looks like:
 /usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/opt/X11/bin:/Users/myuserid/devtools/gcc-arm-none-eabi-4_9-2014q4/bin
  • The various paths are separated by colons; the last one should be the path to your Yagarto bin directory.  If it is not there, try logging out and logging back in so the shell re-reads your '.profile' file (or see the shortcut in the next bullet).  If it is there, try cd'ing to that path to confirm it exists.  If you used a tilde (~) (a shortcut for your home directory, i.e. '/Users/myuserid') in the $PATH entry, it may fail because the shell needs a full directory specification.
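  • Rather than logging out and back in, you can usually pick up the new PATH in the current terminal by re-reading '.profile' and then asking the shell where it finds the compiler:
 source ~/.profile
 which arm-none-eabi-gcc
  • The second command should print the full path to arm-none-eabi-gcc inside your Yagarto bin directory.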
2) I am using the Segger J-link debug probe, which I wrote about in this post.  
  • This is the hardware that interfaces with the Spark Core.  It is a USB box with some pins that connect to the debug pins on the Spark Core.  
  • Segger provides a driver that makes it look like a standard GDB (GNU debug) server.  Programs like Eclipse (or as shown below, the arm-none-eabi-gdb program) can then interface with the GDB server and send commands to start and stop the CPU, load code, set breakpoints, etc.  
  • Another probe that should work is the STLink V2 - but I'm not sure if there is a Mac OS X GDB server available.
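  • Before going further, it is worth checking that OS X actually sees the J-Link over USB.  A quick (if crude) check using the built-in system_profiler tool is:
 system_profiler SPUSBDataType | grep -i segger
  • If nothing is printed, check the USB cable and try another port before blaming the software.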

3) Connect the SWD (Serial Wire Debug) pins of the Spark Core to the debugger.
  • I am using a 9-pin SWD debugger adapter for the J-Link to adapt the 20-pin port to the 0.05" format, because I have other boards (like the KL26Z and my own designs) that use the 10-pin (9 if you don't count pin 7, which is not connected) SWD port format.  I put together a small cable adapter that breaks it out.
  • If you don't have the 9-pin adapter, you can connect your jumper wires directly to the 20 pins on the J-Link itself - you can infer the pinout from the 20-pin debugger connector documentation (on the same page as the 9-pin SWD debugger adapter).
  • Connect the 3V3, GND and RST pins to the corresponding pins on the SWD debugger port (for the 9-pin adapter these are pins 1, 3 and 10, respectively; on the 20-pin header they are 1, 4 and 15).
  • Connect pin D7 to SWD-DIO and pin D6 to SWD-CLK (for the 9-pin adapter these are pins 2 and 4, respectively; on the 20-pin header they are 7 and 9).  The full wiring is summarized at the end of this step.
  • If you need JTAG debugging (SWD is sufficient for most core-level debugging; JTAG can also be used to check pins and hardware), connect the remaining JTAG pins on the J-Link to the corresponding pins on the Spark Core.
  • Check that the debug probe can detect the Spark Core by running JLinkExe (the general command-line tool for the J-Link; later on we will be using the J-Link GDB server, which converts GDB commands into chip-specific instructions sent over the SWD port):
 JLinkExe
 SEGGER J-Link Commander V4.96c ('?' for help)
 Compiled Jan 28 2015 19:28:07
 DLL version V4.96c, compiled Jan 28 2015 19:28:00
 Firmware: J-Link V9 compiled Jan 27 2015 18:19:29
 Hardware: V9.30
 S/N: 269302200
 OEM: SEGGER-EDU
 Feature(s): FlashBP, GDB
 VTarget = 3.282V
 Info: Found SWD-DP with ID 0x1BA01477
 Info: Found Cortex-M3 r1p1, Little endian.
 Info: FPUnit: 6 code (BP) slots and 2 literal slots
 Info: TPIU fitted.
 Found 1 JTAG device, Total IRLen = 4:
 Cortex-M3 identified.
 Target interface speed: 100 kHz
 J-Link>
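  • For reference, here is the wiring described above in one place (pin numbers as given in the steps above - double-check them against your adapter's documentation):
 Spark Core      9-pin SWD adapter   20-pin J-Link header
 3V3             1                   1
 GND             3                   4
 RST             10                  15
 D7 (SWD-DIO)    2                   7
 D6 (SWD-CLK)    4                   9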
4) Start the JLinkGDB server:
  • Important - to get the GDB server to properly recognize the chip, you have to tell it both to use SWD only and to look for the specific chip (in this case the STM32F103CB, which is the one used in the Spark Core).  You do this by including the -if and -device flags, that is:
 jlinkgdbserver -if SWD -device STM32F103CB
 SEGGER J-Link GDB Server V4.96c Command Line Version
 JLinkARM.dll V4.96c (DLL compiled Jan 28 2015 19:28:00)

 -----GDB Server start settings-----
 GDBInit file:                  none
 GDB Server Listening port:     2331
 SWO raw output listening port: 2332
 Terminal I/O port:             2333
 Accept remote connection:      yes
 Generate logfile:              off
 Verify download:               off
 Init regs on start:            on
 Silent mode:                   off
 Single run mode:               off
 Target connection timeout:     0 ms
 ------J-Link related settings------
 J-Link Host interface:         USB
 J-Link script:                 none
 J-Link settings file:          none
 ------Target related settings------
 Target device:                 STM32F103CB
 Target interface:              SWD
 Target interface speed:        1000kHz
 Target endian:                 little

 Connecting to J-Link...
 J-Link is connected.
 Firmware: J-Link V9 compiled Jan 27 2015 18:19:29
 Hardware: V9.30
 S/N: 269302200
 OEM: SEGGER-EDU
 Feature(s): FlashBP, GDB
 Checking target voltage...
 Target voltage: 3.28 V
 Listening on TCP/IP port 2331
 Connecting to target...Connected to target
 Waiting for GDB connection...
  • You can press Control-C to exit the GDB server (it may take a few seconds).  If the target voltage is not around 3.3V (the output above shows 3.28 V), you may not have connected the hardware correctly.  Note that it is 'listening' on port 2331 - later on you will connect to this port (the full address is localhost:2331) when running arm-none-eabi-gdb (the GDB client).
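  • If you want to confirm from another terminal window that the server really is listening, a quick check using the lsof tool that ships with OS X is:
 lsof -i :2331
  • You should see the GDB server process listed with port 2331 in the LISTEN state.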

5) Install the Spark.io Spark Core firmware and test the compile
  • Follow the instructions in the Spark.io Community pages on replicating the Spark firmware (core-firmware, core-common-lib and core-communication-lib) into your local repositories.
  • Compile the default firmware by executing 'make' in the build directory.  It should build without any errors.  If there are errors, be sure that you've installed all three parts of the firmware - core-firmware, core-common-lib, and core-communication-lib.
  • Compile a version with the SWD and JTAG pins enabled and the debug build enabled.  You want to force the compiler to recompile everything ('make clean all') and add the flags USE_SWD_JTAG=y and DEBUG_BUILD=y, e.g.
 make clean all USE_SWD_JTAG=y DEBUG_BUILD=y 
  • There should be a bunch of compiling messages - the most important thing to check is that the -DUSE_SWD_JTAG and -DDEBUG_BUILD flags are being passed to the compiler (you should see them in the compile command for each file; a quick grep check is shown after this list).  The messages should end with some final commands indicating that the .elf, .bin and .hex files are made, along with the size of the .elf file:
 Invoking: ARM GNU Create Flash Image
 arm-none-eabi-objcopy -O binary core-firmware.elf core-firmware.bin

 Invoking: ARM GNU Create Flash Image
 arm-none-eabi-objcopy -O ihex core-firmware.elf core-firmware.hex

 Invoking: ARM GNU Print Size
 arm-none-eabi-size --format=berkeley core-firmware.elf
    text    data     bss     dec     hex filename
   81556    1224   11864   94644   171b4 core-firmware.elf
  • If there are errors, check your paths, and make sure you've installed all three parts of the core firmware.
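  • As a quick way to confirm the debug flags actually reached the compiler, you can filter the build output from the shell (this assumes a bash-like shell; the flags are the ones discussed above):
 make clean all USE_SWD_JTAG=y DEBUG_BUILD=y 2>&1 | grep -e "-DUSE_SWD_JTAG" -e "-DDEBUG_BUILD" | head -5
  • If neither flag shows up in any compile command, the variables are not making it into the build.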
6) Put the Spark Core into DFU mode (device firmware update)
  • We want the Spark Core in DFU mode because in this mode it listens to the SWD/JTAG pins that our debug probe is connected to.  In the normal operating mode these pins are reassigned to GPIO, so the core will not receive the debug signals to halt, load data, etc.
  • In the Windows tutorial, the DFU mode was also used to download code to the core.  Here we will be using the SWD port directly to download code.
  • To get into DFU mode, hold down the mode button (the left button if the USB port is at the top), press the reset button briefly, and release the mode button when the LED begins blinking yellow.
7) Start the GDB client with the core-firmware.elf file and connect to the debugger
  • Start the debugger by going to the build directory of the firmware and typing 'arm-none-eabi-gdb core-firmware.elf'.  This starts the debugger and tells it that you are using the core-firmware.elf program - it will then also load a map of the symbols (functions, variables, memory).  You should get something that looks like this:
 
 arm-none-eabi-gdb core-firmware.elf
 GNU gdb (GNU Tools for ARM Embedded Processors) 7.8.1.20141128-cvs
 Copyright (C) 2014 Free Software Foundation, Inc.
 License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
 This is free software: you are free to change and redistribute it.
 There is NO WARRANTY, to the extent permitted by law.
 Type "show copying" and "show warranty" for details.
 This GDB was configured as "--host=x86_64-apple-darwin10 --target=arm-none-eabi".
 Type "show configuration" for configuration details.
 For bug reporting instructions, please see:
 <http://www.gnu.org/software/gdb/bugs/>.
 Find the GDB manual and other documentation resources online at:
 <http://www.gnu.org/software/gdb/documentation/>.
 For help, type "help".
 Type "apropos word" to search for commands related to "word"...
 Reading symbols from core-firmware.elf...done.
 (gdb)
  • Connect to the GDB server (which you should have running in the background - if not, start it now in a separate window) by typing 'target extended-remote localhost:2331'. 
 (gdb) target extended-remote localhost:2331
 Remote debugging using localhost:2331
 0x0800010c in ?? ()
 (gdb)
  • The 0x0800010c is the current memory location that the CPU is executing from.  If you are getting 0x00000000 as the memory location, the debugger is probably not talking to the chip correctly - re-check your SWD wiring and make sure the core is in DFU mode.
  • Load your firmware onto the Spark Core by typing 'load'
 (gdb) load
 Loading section .isr_vector, size 0x10c lma 0x8005000
 Loading section .text, size 0x13d88 lma 0x8005110
 Loading section .init_array, size 0x58 lma 0x8018e98
 Loading section .data, size 0x470 lma 0x8018ef0
 Start address 0x8005110, load size 82780
 Transfer rate: 3674 KB/sec, 3599 bytes/write.
 (gdb)
  • If your transfer rate is greater than about 3700 KB/sec, your core may not be responding to the programming.  In this case try power-cycling the core and putting it back into DFU mode.
  • Reset the Spark Core by typing 'monitor reset'
  • Run your firmware by typing 'continue' (or 'c' for short).  Your Spark Core should go through its standard reset behavior and eventually start the magenta breathing pattern if you are hooked up to a WiFi network.
  • You should be able to hit 'control-c' (which pauses the program).  The Spark Core should stop its 'breathing' pattern and you should get something that looks like this:
 ^C
 Program received signal SIGTRAP, Trace/breakpoint trap.
 hci_event_handler (pRetParams=pRetParams@entry=0x20004f80,
     from=from@entry=0x0, fromlen=fromlen@entry=0x0)
     at ../CC3000_Host_Driver/evnt_handler.c:250
 250                             volatile system_tick_t now = GetSystem1MsTick();
 (gdb)
  • If the Spark Core does not stop, either you have loaded a firmware that does not have the SWD pins enabled or the debugger is set incorrectly.
  • You should be able to restart the core by typing 'continue' and it should resume breathing or blinking.
  • Troubleshooting: occasionally it seems that the Spark Core gets into a state where it is not responsive to the debugger (you'll see this in the window where you're running JLinkGDB as "WARNING: Failed to read memory at address x").  In this case try power-cycling the Spark Core.
  • To quit GDB, type 'quit' (or press control-d).
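  • Once this is working, you can save some typing by putting the connection steps into a GDB command file and passing it with the -x option.  This is just a sketch using the commands shown above; the filename 'debug.gdb' is arbitrary:
 # debug.gdb - connect to the J-Link GDB server, flash the firmware and reset the core
 target extended-remote localhost:2331
 load
 monitor reset
  • Start it with 'arm-none-eabi-gdb -x debug.gdb core-firmware.elf' and type 'continue' when you are ready to run.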

Once you have a functioning toolchain and debugger, it should be very fast to compile your code and upload it.  I'm still working out this part but what I understand so far is:
  • Applications should live in the 'core-firmware/applications' directory.
  • To compile an application, add 'APP=applicationname' to the make command line - the makefile will then search for that directory.
  • Don't forget to include USE_SWD_JTAG=y and DEBUG_BUILD=y in the make command line if you want to be able to debug it.  Use 'make clean all' if you are changing from a non-debug to a debug version (you need to force the compiler to rebuild all of the dependencies).  A combined example is shown below.
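  • Putting that together, a build command for an application plus debugging would look something like this ('myapp' is just a placeholder for a directory name under core-firmware/applications):
 make clean all APP=myapp USE_SWD_JTAG=y DEBUG_BUILD=y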

This is a first pass at this procedure.  Please let me know if I missed any steps, or if you've discovered a way to make this simpler or more robust.  Happy Spark Core debugging on Mac OS X!
