CLOUDP-289219: Create CLI commands for manipulating stream processors #3496
base: master
Conversation
Force-pushed 22ef83a to e916573
Next time, it might be a little easier to review if we break the commits down by command, or put each command into its own PR :)
Dropping comments because I think I have gone through all of the create command. I haven't looked yet, but I will try not to leave the same comments on the other commands if they have the same issues. I will continue looking at the other commands and try to only leave net new stuff.
* - -o, --output
  - string
  - false
  - Output format. Valid values are json, json-path, go-template, or go-template-file. To see the full output, use the -o json option.
What is the purpose of outputting a create command? Why would someone want to dump Processor <Name> created. to a file?
You can set the output to JSON, i.e. -o json. I don't think anyone would dump Processor <Name> created. to a file, but IMO seeing the JSON output could be valuable as a kind of verification that the correct processor was created (i.e. proper name and pipeline).
Ah, I think I was confused because the output section here in the docs does not indicate that it prints the processor configuration. It just says Processor <Name> created.
E2E tests reviewed
[q] I am only seeing this used by changes from 8 months ago. Any ideas why it is being added in your PR/where it is coming from?
good catch! the e2e test generates this file dynamically here:
mongodb-atlas-cli/test/e2e/atlas/streams_with_clusters_test.go
Lines 128 to 133 in 77b0248
const configFile = "data/create_streams_connection_atlas.json"
file, err := os.Create(configFile)
if err != nil {
	return "", err
}
defer file.Close()
When I ran the test locally, I forgot to clean it up so it was added to the PR as a new file
Nice! Should we add it to the .gitignore file for the repo so it doesn't accidentally get added back in the future?
Hmm, maybe instead of adding to gitignore we can do a better cleanup within the tests. My guess is that since these E2E tests run in Evergreen, the environment gets wiped at the end of the test, so there wasn't much consideration for running the tests locally.
Delete and Describe reviewed
cmd := &cobra.Command{
	Use:   "delete <processorName>",
	Short: "Delete a specific Atlas Stream Processor in a Stream Processing Instance.",
	Long:  fmt.Sprintf(usage.RequiredRole, "Project Read Only"),
This role doesn't seem correct... I don't think you should be able to delete with a read-only role.
you're right, I will update the commands with the required roles provided in the API docs
I haven't opened the editor to look for other examples but is there a way to specify two potential roles? I see that everything has been updated to Project Owner but all the commands seem to support Project Owner OR Project Stream Processing Owner roles
So here's what the RequiredRole is defined as:
mongodb-atlas-cli/internal/usage/usage.go
Line 337 in 77b0248
RequiredRole = "To use this command, you must authenticate with a user account or an API key with the %s role."
I suppose I could add a new variable RequiredRoles that would go something like:
RequiredRoles = "To use this command, you must authenticate with a user account or an API key with the %s or %s role."
I'm a little uncertain about this because what if we add a command that can use one of three roles? Or four? Alternatively, we could write a Streams-specific usage variable and have it live in the internal/cli/streams package or something.
We could do something like create a new string format like this:
RequiredOneOfRoles = "To use this command, you must authenticate with a user account or an API key with any of the following roles: %s."
then use strings.Join(roleList, ", ") and pass the result as the format argument.
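A minimal sketch of that approach, assuming a new RequiredOneOfRoles constant (only RequiredRole exists in internal/usage/usage.go today; the requiredRoles helper is hypothetical). Joining a slice means three or four roles need no extra format variable:

```go
package main

import (
	"fmt"
	"strings"
)

// RequiredOneOfRoles is the hypothetical format string discussed above;
// it is not part of the current usage package.
const RequiredOneOfRoles = "To use this command, you must authenticate with a user account or an API key with any of the following roles: %s."

// requiredRoles joins any number of role names into one usage string.
func requiredRoles(roles ...string) string {
	return fmt.Sprintf(RequiredOneOfRoles, strings.Join(roles, ", "))
}

func main() {
	fmt.Println(requiredRoles("Project Owner", "Project Stream Processing Owner"))
}
```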
	"github.com/mongodb/mongodb-atlas-cli/atlascli/internal/mocks"
)

func TestDeleteOpts_Run(t *testing.T) {
There should be a t.Run() call in this test describing what is being tested/asserted.
[nit] t.Run is for test cases; if you don't have test cases, please don't add it. The name of the test is what's under test: TestDeleteOpts_Run tells me it tests the Run method of the DeleteOpts struct. Editors like IntelliJ will even enable auto-navigation between methods and tests by following this naming convention.
I think I agree with most of what you are saying. The information I'm missing, which a t.Run() (or multiple) could provide, is the expected behavior. When debugging why this fails in the future, I'd have no context on what the expected behavior is; I could go through the assertions, but then it's not clear whether the assertion is no longer valid or the setup is.
func DescribeBuilder() *cobra.Command {
	opts := &DescribeOpts{}
	cmd := &cobra.Command{
		Use: "describe <processorName>",
fwiw, this command seems like it would make more sense to have a json output file that is the .json configuration file that would work with the create command (and eventually update).
hmm, i think the output of this command would work with create depending on the marshaling (if it ignores all the extra fields it should be fine). The only issue is that the API call for creating a stream processor requires a StreamsProcessor struct, not the StreamsProcessorWithStats struct, but we need the stats for the describe command.
	return opts.Print(r)
}

sp := atlasv2.NewStreamsProcessorWithStats(r.Id, r.Name, r.Pipeline, r.State)
I think this should be created through a store method. I am not familiar with the design patterns here; I have only seen the CLI interact with the API through stores rather than directly like this.
that's a good point, but all I'm trying to do here is create the struct. This doesn't make any API calls, so I'm unsure if it should be a store method. @gssbzn is there a best practice that I should follow here?
mockStore.
	EXPECT().
	StreamProcessor(gomock.Any()).
	Return(expected, nil).
	Times(2)
I think that separate stores should be pulled into each test to more specifically assert each test's code.
hmm, do you mean importing the actual store.ProcessorDescriber? I'm not sure how that would work because we need to mock the output of the API calls
No, I am suggesting that we should set up the mockStore in each t.Run() rather than defining it once outside the t.Run. This is an assertion that is not clearly captured in the individual test code, and now we are asserting that it is called twice just because we have two tests.
Start and Stop reviewed
mockStore.
	EXPECT().
	StartStreamProcessor(gomock.Any()).
I think we should have stricter assertions on what we pass to the store calls. Splitting the commands out into separate strings might make this easier to assert on than reconstructing Params objects. But it shouldn't be too hard either way.
I'm not sure that's entirely necessary because a lot of the time, especially in the case of this test, we won't use those parameters. I might be misunderstanding what you're suggesting, though.
I am suggesting that we should assert that the projectId, tenantName, and processorName passed into this store function match the values passed into the CLI command (or the expected values). As it stands, this does not verify that the inputs behave as expected: if someone accidentally hardcoded a tenantName in the CLI command to speed up their testing, and it was missed during code review, we would ship it without knowing, because the test does not assert that the CLI command's inputs are passed all the way through to the store.
This PR has gone 30 days without any activity and meets the project's definition of "stale". This will be auto-closed if there is no new activity over the next 30 days. If the issue is still relevant and active, you can simply comment with a "bump" to keep it open, or add the label "not_stale". Thanks for keeping our repository healthy!
@auddin431 this is currently marked as stale by our bots, are you still working on this?
@gssbzn yep, I'm still working on this. I was out for a while during the holidays and got occupied with other tickets. Will push out the requested changes in the next day or two.
Force-pushed e916573 to c9887a4
Proposed changes
This work adds the noun processors to the top-level noun streams so that stream processors can be interacted with through the CLI. The processors noun has the following verbs:
Jira ticket: CLOUDP-289219
Checklist
I have run make fmt and formatted my code