-
I'm trying to make a simple demonstration of animation using the canvas, as was previously suggested. However, it seems that what I want to do requires a render loop, or perhaps setting something else up, and the documentation is lacking on this. If it's possible to make an example that just alternates the color of a canvas over time, I can probably work with that, but a basic example like this webgpu cube example, just using a canvas, would be even better: https://github.com/cogentcore/webgpu/tree/main/examples/cube I don't need textures at this point. What I aim to eventually do (among other things) is a canvas animation like this: https://0magnet.github.io/wasm-stuff/ I also have an audio spectrogram display, implemented with three GUIs (gomobile, fyne, and wasm) and one TUI (tcell), to which I would really like to add an implementation that uses cogentcore.
-
Here is a simple example of a canvas animation that I have added to the documentation in #1321:

```go
t := 0
c := core.NewCanvas(b).SetDraw(func(pc *paint.Context) {
	pc.DrawCircle(0.5, 0.5, float32(t%60)/120)
	pc.FillStyle.Color = colors.Scheme.Success.Base
	pc.Fill()
})
go func() {
	for range time.Tick(time.Second / 60) {
		c.NeedsRender()
		t++
	}
}()
```

That makes a circle with a changing size based on a time variable incremented in a render loop. If that example is not sufficient, I can definitely help you with the use cases you mentioned; they should absolutely be possible. I am improving the documentation for this in #1321 as mentioned above, and I will continue to work on the documentation.
-
Fantastic.

(video: cogentcore-globe-2024-12-30_17.06.09.mp4)

Here's the code. Something like this in the canvas documentation / examples would be really eye-catching.
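Roughly, the approach is to sample latitude/longitude points on a sphere, rotate them a little each frame, and draw each visible point as a small circle with the same Canvas API as the example above. A simplified sketch of that shape (not the exact code from the video; the scheme color and projection constants are just placeholders):

```go
package main

import (
	"math"
	"time"

	"cogentcore.org/core/colors"
	"cogentcore.org/core/core"
	"cogentcore.org/core/paint"
)

func main() {
	b := core.NewBody("Globe sketch")
	angle := 0.0
	c := core.NewCanvas(b).SetDraw(func(pc *paint.Context) {
		// Sample points on the sphere as latitude/longitude rings.
		for lat := -80.0; lat <= 80; lat += 20 {
			for lon := 0.0; lon < 360; lon += 10 {
				la := lat * math.Pi / 180
				lo := lon*math.Pi/180 + angle
				x := math.Cos(la) * math.Sin(lo)
				y := math.Sin(la)
				z := math.Cos(la) * math.Cos(lo)
				if z < 0 {
					continue // back-facing point; skip for a solid look
				}
				// Orthographic projection into the canvas's 0..1 coordinates.
				px := 0.5 + 0.4*float32(x)
				py := 0.5 - 0.4*float32(y)
				pc.FillStyle.Color = colors.Scheme.Primary.Base
				pc.DrawCircle(px, py, 0.004)
				pc.Fill()
			}
		}
	})
	go func() {
		for range time.Tick(time.Second / 30) {
			angle += 0.02
			c.NeedsRender()
		}
	}()
	b.RunMainWindow()
}
```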
-
In another globe I have, which compiles directly to WebAssembly, I'm using frag shader and vert shader code like this:

It basically just gives a red-to-blue color gradient to make the rotation of the globe more apparent. How can this kind of thing be implemented using cogentcore, in the code I posted in the previous reply? (A rough CPU-side guess is sketched at the end of this comment.)

I should probably also take the opportunity here to ask about mouse interactions with the canvas. Not that I need to do that currently, but it would be appreciated to know whether it's possible. If it were possible to grab the globe animation with the mouse at a certain point and drag it or spin it along whichever axis of rotation, that would be an interesting and fun example to interact with, at least. Animated effects from the mouse moving over the canvas also tend to make really eye-catching examples.

Is it possible to play sound from a cogentcore application, or is it already simple enough to do that cross-platform? I noticed the video example, so I assume it might be possible via embedded audio files using the same method.

Another related question: when is webgpu anticipated to be widely supported, or at least supported on Linux? I've attempted to get some of those examples working in Waterfox and Brave, without success, regardless of the browser flags that supposedly enable experimental support.
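As for the gradient: my rough guess at a CPU-side equivalent is to blend each point's color by its depth before filling it, something like this hypothetical helper (colors.Red, colors.Blue, and colors.Uniform are guesses at the colors package API):

```go
import (
	"image"
	"image/color"

	"cogentcore.org/core/colors"
)

// blendRedBlue returns a solid fill whose color is linearly interpolated
// from red (t = 0) to blue (t = 1).
func blendRedBlue(t float32) image.Image {
	lerp := func(a, b uint8) uint8 {
		return uint8(float32(a)*(1-t) + float32(b)*t)
	}
	c := color.RGBA{
		R: lerp(colors.Red.R, colors.Blue.R),
		G: lerp(colors.Red.G, colors.Blue.G),
		B: lerp(colors.Red.B, colors.Blue.B),
		A: 255,
	}
	return colors.Uniform(c)
}
```

Inside the draw loop, each point would then set something like `pc.FillStyle.Color = blendRedBlue((float32(z) + 1) / 2)` before `pc.Fill()`, so depth maps onto the red-to-blue ramp.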
-
After additional testing, I note that the web / wasm view of the same (60 FPS) app fails to load. If I reduce the frame rate to 30 frames per second, it works; any higher and it just gets stuck on the loading / preview. At high frame rates, CPU usage is a bit higher for both the web and desktop views than with other GUI frameworks I've used. I also note that in my case, with onboard graphics, the desktop version's CPU usage is directly proportional to the size of the window, regardless of the size of the animation within it.

I also experimented with adding the animation code to the live-editable view of the canvas documentation here: https://www.cogentcore.org/core/widgets/media/canvases The performance there was lacking. It seems that the more content is on a page, the slower any animation runs, because (I guess) the whole page is being re-rendered at some point; that is the only explanation I can think of. After adding and then removing the animation there, CPU usage also seemed to stay higher than it should be. That might merit investigation, though I'm unsure such an issue would affect most people, since a live-editable code setup is not very common. It may be worthwhile to handle that case differently than it is currently handled, as it can cause the page to become unresponsive when the animation should be visible.
-
Overall it's very usable, and relatively easy to use compared to other GUI frameworks. I don't think I've ever seen a globe animation in golang done with less code than the above implementation using the core libraries. To echo your sentiments, it's only lacking a bit of optimization in some areas. I'm excited to watch the development, glad to provide constructive feedback on any issues I encounter, and sincerely appreciative of your advice, guidance, and kind and thoughtful regards. Before I knew about Cogent Core, I would never have imagined it would be possible to compile the same golang GUI app to desktop, mobile, and web.
-
I suspect web workers are needed to scale out the work?
-
I'm making some progress towards a spectrogram display using the cogentcore libraries; I just wanted to post this intermediate code, which generates a color gradient, for the reference of others, or to potentially improve the documentation, since it took some time for me to figure out how to do a color gradient on the canvas like this:
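In outline it looks something like this (a simplified sketch rather than the full code; gradient.NewLinear and AddStop come from the cogentcore.org/core/colors/gradient package as I understand it, so the exact calls may differ):

```go
core.NewCanvas(b).SetDraw(func(pc *paint.Context) {
	// A linear gradient satisfies image.Image, so it can be used directly
	// as the fill color for a shape drawn on the canvas.
	pc.FillStyle.Color = gradient.NewLinear().
		AddStop(colors.Blue, 0).
		AddStop(colors.Green, 0.5).
		AddStop(colors.Red, 1)
	pc.DrawCircle(0.5, 0.5, 0.4)
	pc.Fill()
})
```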
-
After figuring out how to set pixels to the desired color, the implementation of the spectrogram display was not too difficult, and I should mention that it's a smaller codebase than any of the 4 other UIs with which I've implemented this.
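The pixel-setting part boils down to mapping each FFT magnitude to a color and writing it into one column of an image; here is a simplified, standard-library-only sketch of that mapping (how the resulting image then gets onto the canvas is omitted, and the color ramp is just an example):

```go
import (
	"image"
	"image/color"
	"math"
)

// writeColumn maps one column of FFT magnitudes (assumed normalized to 0..1,
// one value per pixel row) to pixel colors in the spectrogram image.
func writeColumn(img *image.RGBA, x int, mags []float64) {
	h := img.Bounds().Dy()
	for y := 0; y < h; y++ {
		// Flip vertically so low frequencies end up at the bottom.
		img.SetRGBA(x, y, heat(mags[h-1-y]))
	}
}

// heat maps a magnitude in [0, 1] onto a simple black -> red -> yellow ramp.
func heat(m float64) color.RGBA {
	m = math.Max(0, math.Min(1, m))
	r := math.Min(1, 2*m)
	g := math.Max(0, 2*m-1)
	return color.RGBA{R: uint8(255 * r), G: uint8(255 * g), A: 255}
}
```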
(video: CORE-spectrogram-2025-01-17_13.44.19.mp4)

The caterpillar-like movement of the display is just due to the asynchronous updating. I know you mentioned in #1416 that you would eventually add ...

I've updated my project repo with the core option for the desktop GUI here: https://github.com/0magnet/audioprism-go

Sort of on that note: the current wasm version of this program does not use core and is compiled with tinygo, and that wasm is actually embedded in the golang binary. The wasm version of the core application will likely dwarf the size of the current audioprism binary, and would bloat it to many times its previous size if included in the same way. That's unlikely to matter much for my particular use case; I just mention it because it is sometimes convenient to embed the compiled wasm client code when the client and server codebases are separate, and the size increase of doing that with core is substantial compared to some other frameworks, which you already know. I'm also not sure what the status of core working with tinygo is, though I've seen it mentioned a few times; information on that would be appreciated.
-
Good discussion about perf. I have been using cogent to play with 2D and 3D rendering and compositing of these together to do 4D (like a video editor, but essentially a rendering editor over time). I also use threejs etc. too, because it has a mature architecture that worked out how to get perf on the web via hard-won trial and error. Maybe some of those constructs are needed in cogent; in the end you can't get perf and a lively UX without some organisation of the computation at a formal level.

In relation to web perf, web workers are your "threads". This is a simple example outputting to the console: https://webgpu.github.io/webgpu-samples/?sample=worker

https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/transferControlToOffscreen is the means to do this, but I think it's meant for 2D; not sure, as it's new to me. https://web.dev/articles/offscreen-canvas has a demo and code and explains how it keeps the UX event loop alive.

So, if this is the way: in order to have an isomorphic architecture, the concept of workers needs to be part of both web and native. https://github.com/magodo/go-wasmww is a simple golang/wasm example of workers for web that formalises this multi-tiered architecture. The golang compiler can't do this on its own; it has to be designed into the cogent architecture, obviously, and it needs a shim so that both web and native can compile and run off the same code. The golang way to do this is build tags, rather than runtime reflection (a sketch of such a shim is below).

Compositing in real time is the way to use webgpu workers. On the web, up to 32 canvases are available (last time I checked), with the main "worker" compositing the workers' buffers together. The web way of doing compositing is https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/SharedArrayBuffer which is in all browsers. Correction: a shared array is what's used with an offscreen canvas, because it works with a TransferableObject.

This all leads back to isomorphic architecture. The native and web versions need this structured stack formalised in order to get serious perf, with a way to pass the data around or a reference (aka a shared buffer). In the end you're back to a queue and workers, like we do for large-scale computing on servers, but applied to the GPU. This is why I call it compositing, which is the name for this in the graphics domain.
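To make the build-tag idea concrete, here is a minimal hypothetical shim; the package name, file layout, and the wasm-side bridge are placeholders, not existing cogent API:

```go
// file: worker_native.go
//go:build !js

package worker

// Run executes fn off the UI path. On native builds a goroutine is enough;
// the result is sent back on out.
func Run(fn func() []float32, out chan<- []float32) {
	go func() { out <- fn() }()
}

// file: worker_js.go
//go:build js && wasm

package worker

// Run would hand fn to a Web Worker (e.g. via go-wasmww or a hand-rolled
// postMessage bridge) instead of a goroutine, because goroutines all share
// the single wasm thread. The bridge itself is omitted; this only shows the
// shape of the shim that build tags make possible.
func Run(fn func() []float32, out chan<- []float32) {
	// placeholder: dispatch fn to a worker and forward its result to out
	go func() { out <- fn() }()
}
```

The calling code then imports one `worker.Run` and compiles unchanged for both targets, which is the isomorphic layering described above.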